Monday, May 17, 2010

Samsung's EX1 / TL500 flagship compact articulates 'release' in Korean

Source: http://www.engadget.com/2010/05/17/samsungs-ex1-tl500-flagship-compact-camera-articulates-a-kore/

It's out: Samsung's 10 megapixel EX1 (a.k.a. TL500) with its 3-inch articulating AMOLED display has just been pushed out for retail in its Korean homeland. For 599,000 KRW, or about $400, you get an f/1.8 24mm ultra-wide lens, a 1/1.7-inch CCD, a refined DRIMeIII imaging processor, dual image stabilization, and ISO 3200 max sensitivity (at full resolution), all of which should combine to deliver decent shots (for a compact) in low-light conditions without using a flash. As Samsung's flagship compact, it also supports RAW along with shutter-priority, aperture-priority and full-manual shooting modes. Unfortunately, H.264 video is limited to 640x480 pixels at 30fps. Fortunately, an optional optical viewfinder can be fitted to the hot shoe in case the AMOLED display fails to hold up under direct sunlight -- a very real possibility since there's no mention of Samsung's "Super AMOLED" anywhere in the press release. Can't wait to see the reviews on this pup.

NVIDIA puts its Tegra 2 eggs in Android's basket, aims to topple Apple's A4

Source: http://www.engadget.com/2010/05/17/nvidia-puts-its-tegra-2-eggs-in-androids-basket-aims-to-topple/

Microsoft's Kin One and Kin Two might not turn out to be the most auspicious devices for Tegra's debut in the smartphone arena, but NVIDIA seems to be learning from its mistakes. Admitting that the company committed too strongly to Microsoft with the first generation, Jen-Hsun Huang has now said that the second generation of Tegra will look to Android devices first and foremost. This newfound focus will materialize in both smartphones and tablets in the third and fourth quarters of this year and will, according to Jen-Hsun, offer device makers a viable competitor to Apple's A4 SoC. In other news, NVIDIA has now shipped "a few hundred thousand" Fermi cards and has racked up 70 design wins with its Optimus graphics-switching technology. Eleven of those are now out in the wild, but the vast majority are still to come, mostly as part of the seasonal "back to school" refresh at the end of the summer. These revelations came during the company's earnings call for the first quarter of its 2011 fiscal year, and you can find the full transcript at the source below.

[Thanks, TareG]

HTC Mondrian with 1.3GHz Snapdragon detailed in leaked Windows Phone 7 ROM?

Source: http://www.engadget.com/2010/05/17/htc-mondrian-with-1-3ghz-snapdragon-detailed-in-leaked-windows-p/

As expected, the official-looking Windows Phone 7 OS ROM leaked over the weekend is already yielding results. Pictured above is an image extracted from the "oemavatar.cab" -- it could be a generic Windows Phone 7 image, or it could be the HTC Mondrian already seen referenced in the 100MB file. The kids at XDA-Developers have also pieced together specs from an ongoing analysis of the registry and RGU files. So far they've spotted references to a 4.3-inch WVGA (480x800) display from Optrex and a 1.3GHz QSD8650A/B Snapdragon from Qualcomm -- a chipset that, you might recall, supports multi-mode UMTS and CDMA 3G connectivity. It also appears to pack a digital compass but lack a keyboard. Mind you, none of this is confirmed, but it's very, very intriguing.
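
For the curious, the kind of spelunking described above, pulling spec hints out of the plain-text registry update (.rgu) files in an extracted ROM, amounts to little more than keyword searching. A rough sketch along those lines (the directory layout and keyword list are assumptions for illustration, not the actual ROM contents):

    import re
    from pathlib import Path

    # Strings worth flagging, loosely based on the XDA findings cited above.
    KEYWORDS = re.compile(r"Optrex|QSD86\d+|480x800|WVGA|compass", re.IGNORECASE)

    def scan_rgu_files(rom_dir):
        # Walk an extracted ROM tree and print lines from .rgu/.reg registry
        # files that mention interesting hardware strings.
        for path in Path(rom_dir).rglob("*"):
            if path.suffix.lower() not in {".rgu", ".reg"}:
                continue
            # RGU files are typically UTF-16 text; skip undecodable bytes.
            text = path.read_text(encoding="utf-16", errors="ignore")
            for line in text.splitlines():
                if KEYWORDS.search(line):
                    print(path.name + ": " + line.strip())

    # scan_rgu_files("extracted_rom/")   # path is hypothetical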

[Thanks, Andrew]

Saturday, May 15, 2010

How YouTube 3D Came to Be [Q&A]

Source: http://gizmodo.com/5536385/how-youtube-3d-came-to-be

About a year ago, YouTube made a quiet upgrade—it began to support 3D content. But the even neater thing? The work was essentially that of one employee who worked on the project in his spare "20%" time.

It's just so Google, isn't it? Pete Bradshaw, YouTube software engineer, playing around in 3D in his allotted dabbling time, sparks an update in the world's most popular video sharing service.

You may not have even noticed that YouTube supports 3D—frankly, before this interview, I had no idea either. But from red-and-blue anaglyph to eye-crossing Magic-Eye-style viewing, the service now supports the uploading of stereoscopic footage (two video streams) that it will mix, in real time, right within your browser, in a manner of the viewer's choosing.
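
That real-time mix is, at heart, simple per-pixel channel arithmetic: an anaglyph frame takes one color channel from the left eye's view and the complementary channels from the right eye's. A minimal sketch of the idea in Python with NumPy follows; the frame arrays and glasses names are illustrative assumptions, not YouTube's actual player code (which runs inside Flash).

    import numpy as np

    def mix_anaglyph(left, right, glasses="red_cyan"):
        # Combine left/right eye frames (H x W x 3 uint8 RGB arrays) into one
        # anaglyph frame for the viewer's glasses. Illustrative sketch only.
        out = np.empty_like(left)
        if glasses == "red_cyan":          # classic red/blue (cyan) glasses
            out[..., 0] = left[..., 0]     # red channel from the left view
            out[..., 1:] = right[..., 1:]  # green + blue from the right view
        elif glasses == "green_magenta":
            out[..., 1] = left[..., 1]     # green from the left view
            out[..., 0] = right[..., 0]    # red from the right view
            out[..., 2] = right[..., 2]    # blue from the right view
        elif glasses == "yellow_blue":
            out[..., :2] = left[..., :2]   # red + green (yellow) from the left
            out[..., 2] = right[..., 2]    # blue from the right
        else:
            raise ValueError("unknown glasses type: " + glasses)
        return out

Doing this remix per frame at playback time, rather than baking one anaglyph into the uploaded file, is what lets a single upload serve every color of glasses, which is exactly the problem Pete describes running into below.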

(Note: To toggle the different ways you can view these embedded videos in 3D, you'll need to view them on YouTube, where they'll be equipped with a 3D pulldown menu.)
I chatted with Pete, along with spokesman Chris Dale, about how YouTube 3D came about and where YouTube will take 3D into the future.

Why did you begin the project?

Pete: The germ of the idea came about with the Super Bowl a couple of years ago, when there was a promotion for Monsters vs. Aliens and they were giving away red and blue glasses out in the supermarkets. And I got those glasses, since they were supposed to work with YouTube.

So I went digging for 3D content on the site. And there was a lot, but the issue was that sometimes it was mixed with different colors (because you can get different colored glasses). There's red/green and yellow/blue and all these other things. So if you were uploading 3D video, you basically planned for one specific kind of glasses you wanted to support. If you didn't have just the right pair, you were out of luck.

We were just kind of sitting around talking about this, and we came up with the idea that, well, we could mix the left and right views inside the player and give an experience that works on any of the different-colored 3D glasses.

This just started as, like, a random lunch discussion, and afterward I hacked up a simple little demo that worked on my own machine. I showed it to a few people and they were very surprised. They were like, "We should launch it! When does it launch?"

And so 3D was integrated into the player in a way users could actually use. The shooter uploads the two views side by side, and then we do the mix in the player. Then the viewer tells us the color of their glasses, or whether they'd rather do some of the crazy, cross-eyed things. (We actually added support for a few more display systems after launch.)
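
That workflow, one side-by-side upload mixed at playback time according to the viewer's choice, could be sketched roughly like this. It is a hypothetical illustration that reuses the mix_anaglyph helper from the earlier sketch; the real logic lives inside the Flash player.

    import numpy as np

    def split_side_by_side(frame):
        # Split a side-by-side stereo frame; the left half is the left eye's view.
        width = frame.shape[1]
        return frame[:, : width // 2], frame[:, width // 2 :]

    def render_for_viewer(frame, mode="red_cyan"):
        # Produce whatever the viewer asked for: an anaglyph matching their
        # glasses, or a glasses-free arrangement. Illustrative sketch only.
        left, right = split_side_by_side(frame)
        if mode == "cross_eyed":
            # Swap the halves so crossing your eyes fuses the two views.
            return np.concatenate([right, left], axis=1)
        if mode == "parallel":
            return frame  # plain side-by-side for parallel (wall-eyed) viewing
        return mix_anaglyph(left, right, glasses=mode)  # from the earlier sketch

Keeping the two views separate until the last moment is presumably also what made it easy to add more display systems after launch: each new mode is just another branch in the playback mix, not a re-encode of the library.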

How long was it between the germ of the idea and actually having something running and launched?

Pete: From the first demo, I probably spent 3 weeks of actual work—not all of it continuous because, given my background, I'm more on the backend server side of stuff here at YouTube than the player side. So there was a bit of a learning curve for me coming into that, meaning I grabbed people at lunch, asking them, "Hey, how did you guys build this stuff?"

It seems like Nvidia is making a huge push with all their shutter glasses. Do you guys think that shutter is the next big step?

Pete: Well, I don't know if you saw the demo at CES where Nvidia had their big 3D press event—at the end of it, they worked with Adobe and us to get a YouTube video playing with the shutter glasses, and it just works.

It's not actually launched yet, but most of the hard work is on their side: getting Adobe to talk to the shutter people. But once that's done, it's definitely something we're interested in supporting.

Do you really see 3D glasses taking off?

Pete: I definitely think 3D is coming, and it's going to be a standard feature. But adoption rates and exact technologies, if I knew that stuff, I would be investing.

YouTube can be difficult enough to run in HD. It feels like, by adding 3D, you're basically doubling the information if you were to compete with, say, 60fps Blu-ray 3D.

Pete: I take the point that it's a heavier burden for the machine to display. We've got some player changes and plans to come that will help with that, but there's also a lot of interest from Adobe and the HTML5 folks in making this kind of stuff work.

Chris: Occasionally people say, "God, how do you guys support the infrastructure cost and all this other stuff?" I don't think it's really something we worry about too much, but as far as 3D goes, I think it really depends to a great degree on the user's computer, on computing power catching up to where video is evolving.

What's the endgame of where you're going with this in terms of 3D adoption?

Pete: It's been used a lot from the start, but I'm not sure where it's going, because users have done all sorts of crazy stuff—like there was the craze where people were getting a bunch of LEDs and doing long exposures. There were also a lot of videos from those Fuji FinePix REAL 3D W1 cameras, along with a lot of users just uploading random stuff with them.

It's kind of fun because a lot of the current stuff is more YouTube-y. Instead of being some blockbuster or some guy working on a CGI animation, it's just like, "Hey, here is my garden," as some guy in Japan films his garden in 3D.

Another thing that happened that was just a complete surprise—these guys were using the stereo video technology for surgery—a kind of keyhole surgery. And until now, one surgeon would perform a procedure with a stereo microscope, and all the students just watched the back of this guy's head. Now, they've got all the cameras and the HD video. We just have one super short, 30 second clip of brain surgery, and it's kind of gross...but it's great.

Is it right to say you guys aren't really trying to predict where this is going, other than just saying that you're going to support 3D? Like, "we don't know exactly where this is going to go, or how the cameras and stuff are going to work out, but YouTube is going to support it."

Pete: We do think about it and maintain an interest, but every time we do, the users have already started doing something crazy and different. I remember one interesting example at launch: a guy who's doing his PhD in some kind of visualization started asking these great questions about how we're actually mixing the views together. Even now that the feature has launched, there's still a feedback loop with users, who come to us about it and shoot us ideas for improvements.

Chris: We also have content partnerships with many of the major Hollywood studios, television networks and content creators around the world. A lot of them have seen this, and they want to tackle it; they want to think of ways of showcasing their movie trailers in 3D. We've even been asking, could we live stream something in 3D?

However, I can't get into specifics about what movie studios or what trailers you could conceivably see.

What portion of YouTube uploaders are really doing stuff with 3D?

Chris: It could probably be pretty small. I mean, we're talking—we have thousands of 3D videos on the site, but we have 24 hours of video uploaded every minute.

So it's essentially thousands vs countless.

Chris: Exactly. I think it's still very small, but it's growing, and it's growing fast. Like when we first did mobile uploads, they started to trickle in, but when the iPhone 3GS came out, they grew by, I think, 500% in the weeks following the 3GS release. Over the course of 2009, mobile uploads were up 2,000%.
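
To put "thousands vs. countless" in rough numbers: at 24 hours of video arriving every minute, the site takes in tens of thousands of hours of footage per day, so even several thousand 3D clips are a rounding error. A back-of-the-envelope sketch (the 3D clip count and length are assumed placeholders, not real figures):

    # Scale comparison using the figures quoted above.
    hours_uploaded_per_minute = 24
    hours_uploaded_per_day = hours_uploaded_per_minute * 60 * 24   # 34,560 hours/day

    assumed_3d_clips = 5_000      # "thousands": a placeholder, not a real count
    assumed_clip_minutes = 3      # assumed average clip length
    total_3d_hours = assumed_3d_clips * assumed_clip_minutes / 60  # 250 hours

    print(f"Uploaded per day: ~{hours_uploaded_per_day:,} hours")
    print(f"All 3D content (assumed): ~{total_3d_hours:,.0f} hours")
    print(f"3D share of one day's uploads: {total_3d_hours / hours_uploaded_per_day:.2%}")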

Is there any interest in taking this 3D tech beyond video at this point—like maybe Picasa?

Chris: One of the things we've both learned at YouTube is to never say never, and the truth is that the cross-pollination across different Google services and properties has accelerated significantly over the last year... We're really in an amazing stage of 3D right now, and I think you can expect more cool things from a lot of different companies, including Google, when it comes to 3D.

Intel promises to bring wireless display technology to other mobile devices

Source: http://www.engadget.com/2010/05/15/intel-promises-to-bring-wireless-display-technology-to-other-mob/

Details are unfortunately light on this one, but Intel has closed out the week with one interesting tidbit of news -- it's apparently planning to bring its wireless display technology (a.k.a. WiDi) to netbooks, tablets and other mobile devices. That word comes straight from Intel wireless display product manager Kerry Forrell, who says that "we fully expect to take the technology there," but that he can't yet provide a specific time frame. Those plans are further backed up by Intel CEO Paul Otellini himself, who told investors this week that "what we'll be doing over the next few years is take the Wi-Di capability that's in the laptop today and extend that into all the Intel platforms." Intel doesn't seem to be stopping there, either, with Forrell further adding that the company sees the technology being built into TVs "over time."
