Thursday, September 11, 2008

Giz Explains: Why HD Video Downloads Aren't Very High Def [Giz Explains]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/388906550/giz-explains-why-hd-video-downloads-arent-very-high-def

Yesterday Apple introduced HD TV downloads to the iTunes store, meaning you can watch Peter be super emo on Heroes at a crispy 720p resolution. That's a higher resolution than DVD, and technically, yup, that's HD. There's a catch though. Like every other video download service touting HD videos, it's all actually lower quality than DVD.

It's all about bitrate: how much data gets used for each second of audio or video, measured in bits per second. Generally speaking, a higher bitrate translates into higher quality audio and video, though quality can also be affected by the codec—the encoding and compression technique used to make and read the file—so bitrate is not an absolute mark of quality, but it's still a very good indicator.

You're probably most familiar with this bitrate business when it comes to ripping your CDs. When you shove a CD into your computer, your ripping program will ask what format you want and what bitrate you want. A song ripped at a higher bitrate will sound better, with more presence and detail, but it does take up more space.
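
If you want to see that tradeoff in actual numbers, the math is dead simple: file size is just bitrate times playing time. Here's a rough back-of-the-napkin Python sketch (our own illustrative figures, not anything official) for a four-minute song:

# Rough file-size math for a ripped song: size = bitrate x duration.
# Illustrative numbers only; real files vary with codec overhead and VBR.

def song_size_mb(bitrate_kbps, minutes):
    """Approximate size in megabytes of a constant-bitrate rip."""
    bits = bitrate_kbps * 1000 * minutes * 60
    return bits / 8 / 1_000_000  # bits -> bytes -> megabytes

for kbps in (128, 256, 320):
    print(f"4-minute song at {kbps} kbps: ~{song_size_mb(kbps, 4):.1f} MB")
# 128 kbps comes out to ~3.8 MB, 256 kbps to ~7.7 MB, 320 kbps to ~9.6 MB.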

The same principle applies to video, though it's actually a bigger deal, because it's easier to see quality differences in video than it is to hear differences in audio. The bits make a huge difference when you get into fast-moving stuff like sports or action movies—to be frank, they'll look like splattered, smeared shit in highly compressed low-bitrate vids. The chart below, expertly crafted by George Ou at ZDNet, provides a solid starting point for comparison, listing average bitrates for most of the digital video out there.

As you can see, regular DVD runs at about 6-8 megabits per second. High-def iTunes content, despite having a higher resolution, is half that, a mere 4Mbps. Vudu's current HD movies are also about 4Mbps, if you've got the pipes. Xbox Live Marketplace has the highest bitrate—and indeed, often gets props for its quality—at close to 6.8Mbps. On the other hand, standard-def movies on the Netflix Roku box max out at around 2.2Mbps—and are often delivered at lower quality because of bandwidth constraints. iTunes standard-def TV shows run around 1.5Mbps. Now, consider that Blu-ray runs a mean 40Mbps and you see that the definition of "HD" is suddenly remarkably vague.
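
Run that same bitrate-times-duration math against the numbers above and the gap gets obvious. This Python sketch assumes a two-hour movie at a constant average bitrate, so treat the output as ballpark, not gospel:

# Size = bitrate x duration, using the average bitrates quoted above.
# Assumes a two-hour movie at a constant bitrate; real encodes vary.

def movie_size_gb(bitrate_mbps, hours=2.0):
    """Approximate size in gigabytes of a movie at a given average bitrate."""
    bits = bitrate_mbps * 1_000_000 * hours * 3600
    return bits / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

services = {
    "iTunes SD TV": 1.5,
    "Netflix/Roku SD": 2.2,
    "iTunes HD / Vudu HD": 4.0,
    "Xbox Live Marketplace": 6.8,
    "DVD": 7.0,
    "Blu-ray": 40.0,
}
for name, mbps in services.items():
    print(f"{name}: {mbps} Mbps -> ~{movie_size_gb(mbps):.1f} GB for two hours")
# DVD lands around 6.3 GB, iTunes HD around 3.6 GB, and Blu-ray around 36 GB.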

That's a pissload of numbers. What does that mean?

This comparison test we ran in February pretty much shows you what's wrong: No matter how awesome MPEG-4 compression—or whatever the codec of the month is—gets, it can't work miracles when it's missing bits. It's why Vudu, for instance, is testing out a new closer-to-real-HD service—which, they've revealed to us, has three times the bitrate of any other download service on the market, meaning it should be close to 20Mbps—that will take hours to deliver to your home. But even then, the notion that it would truly rival Blu-ray is totally laughable.

It's not just download services giving you this watered-down, so-called "HD lite," either. Comcast was busted cramming three HD channels into the space of two, resulting in crappy-looking HDTV, and the satellite guys adding a million HD channels a year aren't much better.

Now that you understand what makes or breaks an HD picture—the amount of data—it's probably no surprise to you that the major reason everyone is peddling subpar HD is bandwidth. HD content is pipe-bustingly huge—a standard-def Battlestar Galactica file on iTunes is 520MB and takes about 15 minutes to download via a strong cable connection. The 720p HD download is 1.4GB and takes 40 minutes or so for your hard drive to completely swallow. The Blu-ray version of the same ep might be 10 times that—like 14GB. To put that in more context, a single TV episode would take up twice the space of the average dual-layer DVD movie.
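
For the curious, those download times imply a real-world cable connection of roughly 4.7Mbps. Here's the same arithmetic in a quick Python sketch so you can plug in your own connection speed (it assumes a steady pipe with zero overhead, which your ISP will never actually give you):

# Download-time arithmetic implied by the file sizes above.
# Assumes a steady connection with no overhead; real-world speeds fluctuate.

def download_minutes(size_gb, connection_mbps):
    """Minutes to pull a file of size_gb down a connection_mbps pipe."""
    bits = size_gb * 8 * 1_000_000_000
    return bits / (connection_mbps * 1_000_000) / 60

CABLE_MBPS = 4.7  # roughly what 520MB in about 15 minutes works out to

print(f"SD episode, 0.52 GB: ~{download_minutes(0.52, CABLE_MBPS):.0f} minutes")       # ~15
print(f"720p episode, 1.4 GB: ~{download_minutes(1.4, CABLE_MBPS):.0f} minutes")        # ~40
print(f"Blu-ray-size episode, 14 GB: ~{download_minutes(14, CABLE_MBPS):.0f} minutes")  # ~400, call it 6.5 hours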

Right now, we don't have the broadband infrastructure to support it, and who knows when we will? Hell, the people with the best chance of giving us that added bandwidth—the major ISPs like Comcast and AT&T—are doing just the opposite: Implementing usage caps that will mean less HD downloading. The sad thing is, they probably won't even use the added bandwidth to make their own HD TV channels look better.



Rumor: Apple MacBook Event on Oct. 14 [Rumor]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/389006266/rumor-apple-macbook-event-on-oct-14

Yesterday's new iPods were lovely and all, but if you're like me, you wanted something more. Like some notebooks. Not to worry: Daring Fireball's John Gruber says, according to the standard "sources familiar with Apple's hardware plans," that Apple's "Let's MacBook" event will happen on Oct. 14.

While he doesn't get specific about what the new hardware will be, the heavy favorites are new MacBooks clad in aluminum (peace out, white) and new MacBook Pros, both long overdue for an overhaul. Also likely is a gut refresh of the MacBook Air with a faster processor, and as MacRumors points out, the iPod classic's new 120GB HDD is the same kind used in the Air.

An October event also matches up with the date floated for the most pipe dreamy of all MacBook rumors, a MacBook Touch, and Apple's recent warning to retailers to stock up on current inventory. What are you hoping for? [Daring Fireball via MacRumors]



TEAM 0.5 Microscope Takes Closest Look Ever at Graphene, the World's Strongest Known Material [Powerful Microscopes]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/389035648/team-05-microsope-takes-closest-look-ever-at-graphene-the-worlds-strongest-known-material

Graphene is getting a lot of publicity these days. It is being hailed as the future of the electronics industry—the material that will eventually replace silicon. It has also recently been confirmed as the world's strongest known material. Now, researchers at the Berkeley Lab have thrust graphene into the spotlight once again thanks to the TEAM 0.5: the world's most powerful transmission electron microscope. It has produced the first "stunning" images of graphene's individual carbon atoms.

Now, I'm no scientist, but apparently this sort of image gives even the most seasoned electron microscopist a raging science boner. But it is not so much about the graphene as it is about the potential of the TEAM 0.5. One researcher noted that it "allows for the detection of every single atom from the Periodic Table provided that the sample under investigation can stand the radiation damage." Basically, it can study individual atoms in real time and produce high-resolution images of its subject. That will allow researchers to fully realize the potential of graphene by understanding how defects in the crystal structure can affect its properties. And they claim this is only the tip of the iceberg. Noooow I feel a science boner coming on. [Nanowerk and Science Daily]



Leica's $11,000 Noctilux 50mm f/0.95 Lens Is a Nightvision Owl Eye For Your Camera [Cameras]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/389058581/leicas-11000-noctilux-50mm-f095-lens-is-a-nightvision-owl-eye-for-your-camera

Yeah, you read that right: f/ zero point nine five. As in less than f/1, which is where Leica's legendary Noctilux sat before and as low as Canon goes with its 50mm f/1.0L glass, making this the world's fastest major consumer lens on the market today (each full stop is a factor of two in light, so f/0.95 pulls in a bit more than double the light of an f/1.4 lens, putting it over a full stop faster). The new Noctilux was leaked by a French magazine with details of a Photokina release later this month, and it looks like it'll use Leica's standard M mount, so it will work with your M8 digital or any other M-mount camera (Epson RD-1s owners, all five of you!) to let you take pictures like this:

Yeah, that's candlelight only. Taken with the previous f/1.0 Noctilux, natch, so with the new lens you could swap in an even smaller candle and still pull off the same shot, or try some insane depth-of-field bokeh effects. Awesome stuff, all for €8,000 ($11,260).
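
If you're wondering how big a jump f/0.95 really is, light gathering scales roughly with one over the f-number squared, and each full stop is a factor of two in light. A quick illustrative Python calculation (our math, not Leica's spec sheet):

# Light gathering scales roughly as 1 / (f-number)^2; each full stop is 2x the light.
# Compare the new f/0.95 Noctilux against f/1.0 and f/1.4 lenses.
import math

def stops_faster(f_slow, f_fast):
    """Full stops of extra light the smaller-f-number lens gathers."""
    return math.log2((f_slow / f_fast) ** 2)

for f in (1.0, 1.4):
    ratio = (f / 0.95) ** 2
    print(f"f/0.95 vs f/{f}: {ratio:.2f}x the light ({stops_faster(f, 0.95):.2f} stops)")
# vs f/1.0: ~1.11x (about 0.15 stop); vs f/1.4: ~2.17x (just over a full stop).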

And now that you're in a tizzy about super-fast, super-expensive lenses, take a look at the incredible story of the custom Zeiss lens Stanley Kubrick demanded for candle-lit scenes in Barry Lyndon. It opened up to a crazy f/0.7. Well worth the read.

[Leica Rumors via Gadget Lab, Photo: lylevincent]



Google and Friends to Bring Satellite Internets To 3 Billion People in Africa and Other Developing Markets [Google]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/389128343/google-and-friends-to-bring-satellite-internets-to-3-billion-people-in-africa-and-other-developing-markets

Today Google, along with HSBC and a few other investors, helped place an order for 16 low-orbit Thales Alenia satellites to begin the push for a massive broadband deployment in Africa and beyond, one it hopes will help connect the 3 billion people in the world who are currently webless. It's a noble plan, with quite a long way to go.

Google and its partners threw $60 million of the required $150-$180 million into the kitty of O3b Networks (the Other 3 Billion, get it?), the firm established to launch the satellites and manage the initiative. A satellite downlink is of course only the first step in setting up a fresh broadband network, but the company also has plans to convert mobile phone towers into multipurpose high-speed network nodes, a buildout estimated to cost $750 million all told when complete. Google says it will help drop the price of broadband by up to 95% in some places where it's a rare commodity. When the satellites launch in 2010, we'll be one step closer to 3 billion more new Googlers, looking at AdSense ads all the way, of course. [Financial Times via /.]



Crysis Warhead Ultra Optimized PC Comes with Face-Melting Specs... for $700? [PCs]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/389163372/crysis-warhead-ultra-optimized-pc-comes-with-face+melting-specs-for-700

The guys at GameCyte are keyed up to try out Crysis Warhead on the Optimized PC, a Core 2 Duo E7300, GeForce 9800GT system built by Ultra and vetted by game developer Crytek to bring the game fully to life (and death). The clincher: It only costs $700. Since the GameCyte guys thought this was too good to be true, they started asking Ultra some uncomfortable questions.

Fearing that the system came as a bag of components, they were relieved to hear that it's actually a fully built and tested system. Fearing a white-box scenario where you have to add your own OS, they were again happy to hear it comes with Windows XP Pro installed with the latest service pack—though it doesn't appear the game comes in the bundle. Ultra claims that the Optimized PC will run Crysis Warhead at the highest DX9 setting at 30 frames per second, and that the game was actually "fine tuned" to work with the Nvidia 9800 GT video card.

Pre-orders at TigerDirect.com start next week. Sounds like a sweet deal to me—even if you still have to buy the game and a monitor and speakers—but give me your thoughts... You buyin' this? [GameCyte]



iPod Touch v2 Secretly Has Bluetooth, But Will Apple Enable It? [Ipod Touch Version 2]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/389175718/ipod-touch-v2-secretly-has-bluetooth-but-will-apple-enable-it

In their teardown of the iPod touch version 2, iFixit found a secret surprise: A Broadcom Bluetooth chipset! Though totally unannounced and not listed on the spec page (Apple says Nike+ doesn't use Bluetooth), the iPod touch's Bluetooth chipset supports 2.1+EDR. We don't know for sure yet if it has A2DP, which would let you use stereo headphones—and be another hardware one-up over the iPhone besides the built-in Nike+ functions. We've got our fingers crossed—what else would it be used for? Update: MacRumors notes that Nike+ uses the same 2.4GHz frequency as Bluetooth, so that might be what's going on here. [iFixit]

