Tuesday, April 05, 2016

Panasonic's Lumix GX85 is a compact camera that packs a punch

Source: http://www.engadget.com/2016/04/05/panasonic-lumix-gx85/

The Lumix series is expanding with the GX85, an interchangeable-lens mirrorless camera featuring a compact body and impressive specs. Panasonic says this shooter combines the best of its GX8 and GX7, but with some improvements over both. For starters, the Lumix GX85 sports a 16-megapixel Live MOS sensor and a new Venus Engine processor, along with a max ISO of 25,600, WiFi, up to 8-fps continuous shooting and in-camera image stabilization. Panasonic has also eliminated the low-pass filter, which should help you capture sharper, more detailed pictures.

Not surprisingly, given how Panasonic has been a big proponent of 4K, the GX85 also records Ultra HD (3,840 x 2,160) video at 24 and 30 fps, as well as 1080p at 60 fps. And if you're familiar with the Lytro camera, you'll probably like playing around with Panasonic's Post Focus function. So how does that work? The GX85 records a burst using the 49 areas of its autofocus system, near and far, capturing a frame at every focal point; after you take a shot, you tap anywhere on the 3-inch screen to choose your preferred focus area. That means you could end up with 49 different pictures from a single press of the shutter.
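The selection step above can be sketched in a few lines of code. This is purely illustrative -- the camera's actual firmware, screen resolution and frame ordering are assumptions -- but it shows the idea: map the tap position to the nearest of the 49 (7 x 7) focus areas, then pull the burst frame focused there.

```python
# Illustrative sketch of Post Focus selection (not Panasonic's firmware).
# Assumes the 49 autofocus areas form a 7 x 7 grid and the burst frames
# are stored row-major by focus area -- both assumptions for illustration.

GRID = 7  # 7 x 7 = 49 focus areas

def tap_to_focus_area(x, y, screen_w=3000, screen_h=2000):
    """Return the (row, col) focus area under a tap at pixel (x, y)."""
    col = min(int(x * GRID / screen_w), GRID - 1)
    row = min(int(y * GRID / screen_h), GRID - 1)
    return row, col

def frame_for_tap(x, y, frames):
    """Pick the burst frame whose focus area matches the tap."""
    row, col = tap_to_focus_area(x, y)
    return frames[row * GRID + col]
```

Tapping the top-left corner selects the frame focused on area (0, 0); tapping dead center selects the middle area's frame.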

Panasonic's Lumix GX85 is coming to the US in mid-May for $800, which includes a 12-32mm kit lens and your choice of a black or silver model.

Read More...

Monday, April 04, 2016

World's most powerful X-ray laser will get 10,000 times brighter

Source: http://www.engadget.com/2016/04/04/slac-x-ray-laser-upgrade/

If you think that Stanford's use of a super-bright X-ray laser to study the atomic-scale world is impressive, you're in for a treat. The school and its partners have started work on an upgrade, LCLS-II (Linac Coherent Light Source II), whose second laser beam will typically be 10,000 times brighter and fire 8,000 times faster than the first -- up to a million pulses per second. The feat will require an extremely cold (-456F), niobium-based superconducting accelerator cavity that conducts electricity with zero losses. In contrast, the original laser shoots through room-temperature copper at a relatively pedestrian 120 pulses per second.
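The quoted speedup checks out as straightforward arithmetic -- a million pulses per second against the original's 120:

```python
# Sanity-checking the quoted repetition-rate speedup.
lcls_rate = 120          # pulses/second, room-temperature copper accelerator
lcls2_rate = 1_000_000   # pulses/second, superconducting niobium cavities

speedup = lcls2_rate / lcls_rate
print(round(speedup))    # ~8,333, consistent with the quoted "8,000 times faster"
```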

The first X-ray laser isn't going away -- if anything, it'll be more useful than ever. The combination of the two beams will cover a wider energy range and help scientists study extremely small and extremely fast processes that either couldn't be recorded before or would take ages to examine in full. That, in turn, should lead to discoveries that advance electronics, energy and medicine. The big challenge is simply waiting for the upgrade, since it won't be ready until sometime in the early 2020s.

The LCLS-II accelerator upgrade

Source: SLAC National Accelerator Laboratory

Read More...

Thursday, March 31, 2016

Google makes it easier to bring VR to your apps and the web

Source: http://www.engadget.com/2016/03/30/google-vr-view/

The challenge of bringing virtual reality to the masses isn't so much recording it as putting it in front of people's eyeballs. How do you plunk VR into an app without resorting to exotic code? Google can help. It's launching a VR View tool that makes it relatively easy to embed VR photos and videos in apps and websites. In native apps, it's just a few lines of code using the Cardboard developer kit (which now supports iOS, we'd add). On the web, you only need embed code like the sort you use for 2D clips.
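For the web side, the launch-era embed was an iframe pointing at a Google-hosted player page. The hosted URL and parameter names below are recalled from the launch documentation and may have changed since, so treat this snippet as illustrative only:

```html
<!-- Illustrative VR View embed; the hosted player URL and the parameter
     names (image, is_stereo) are assumptions based on the launch docs. -->
<iframe width="100%" height="300"
        allowfullscreen frameborder="0"
        src="https://storage.googleapis.com/vrview/index.html?image=//example.com/pano.jpg&amp;is_stereo=true">
</iframe>
```

The point is the simplicity: a 360-degree photo drops into a page with one tag, the same way a YouTube clip does.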

It's a seemingly simple effort, but it could mean a lot for VR adoption. If it's trivial to add VR to apps and the web, you're more likely to see it used on a regular basis -- not just for the occasional experiment. You'll still need VR gear to make this more than a click-and-drag experience, of course, but it's still an important piece of the puzzle.

Via: TechCrunch

Source: Google Developers Blog

Read More...

Friday, March 25, 2016

Amazon shows you how to make an Echo with Raspberry Pi

Source: http://www.engadget.com/2016/03/25/amazon-shows-you-how-to-make-an-echo-with-raspberry-pi/

If you're into messing with hardware and have some basic programming skills, you can put together an Amazon Alexa device of your very own. Amazon has even put together an official guide to do so on GitHub, Lifehacker reports. You'll need to snag a Raspberry Pi 2 and a USB microphone to make it happen, but you've probably got the other required hardware (a micro-SD card for storage, for example) lying around. Unfortunately, due to limitations with Amazon's Alexa Voice Service, your creation can't listen for a wake word the way the Echo and Echo Dot do. Instead, you'll have to press a button to issue commands. This isn't the first DIY Amazon Echo project, but it's notable since it comes officially from Amazon. The GitHub guide is also fairly detailed, so you can probably follow along even if you don't know what all the commands mean. It could be a fun project for anyone who wants to learn a bit more about hardware.

Via: Hacker News

Source: Amazon (GitHub)

Read More...

AI-written novel passes first round of a literary competition

Source: http://www.engadget.com/2016/03/24/ai-written-novel-passes-first-round-of-a-literary-competition/

Researchers from Future University Hakodate have announced that a short novel co-written by an artificial intelligence the team developed was accepted into a Japanese story competition, the Hoshi Shinichi Literary Award. Though the story didn't ultimately win, its acceptance does suggest that AI systems are quickly becoming capable of emulating human-like creativity.

The team, led by computer science professor Hitoshi Matsubara, collaborated closely with their digital construct during the writing process. The humans first assigned a gender to the protagonist and developed a rudimentary outline of the plot. They also assembled a list of words, phrases, and sentences to be included in the story. It was the AI's job to assemble these distinct assets into a unified text that wasn't just intelligible but compelling as well. The result was a novel entitled Konpyuta ga shosetsu wo kaku hi, or "The Day a Computer Writes a Novel", about an AI that abandons its responsibilities to humanity after recognizing its own talent for writing.
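As a toy illustration of that division of labor -- this is NOT the Future University system, just a sketch of the idea -- a program can mechanically assemble human-supplied phrase banks along a human-supplied outline:

```python
import random

# Toy sketch of the human-scaffolded setup described above. Humans supply
# the outline and the word/phrase banks; the program's only job is to
# assemble them into running text. All content here is invented filler.

OUTLINE = ["opening", "turn", "closing"]

PHRASE_BANK = {
    "opening": ["It was a cloudy morning in the lab.",
                "The terminal hummed quietly to itself."],
    "turn":    ["The machine set aside its assigned duties.",
                "It had discovered the private joy of writing."],
    "closing": ["The humans never asked it to calculate again."],
}

def assemble_story(seed=0):
    """Pick one phrase per outline beat and join them into a paragraph."""
    rng = random.Random(seed)  # seeded for reproducible output
    return " ".join(rng.choice(PHRASE_BANK[beat]) for beat in OUTLINE)

print(assemble_story())
```

The team's actual system was far more sophisticated, but the shape is the same: the creative scaffolding is human, and the machine's contribution is the assembly.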

This is the first year that the Hoshi Shinichi Literary Award has allowed submissions from machines. Of the 1,450 novels received for this year's competition, 11 were human/AI collaborations like Future U's. Interestingly, judges throughout the competition's four rounds are never told which stories were written by computers and which by humans. Though the team's story made it past the first round, it was eventually eliminated because, as sci-fi novelist and award judge Satoshi Hase explained, it lacked sufficient character development despite being well-structured. Welp, there's always the X-Prize.

Via: Motherboard

Source: The Japan News

Read More...