Monday, April 04, 2016

World's most powerful X-ray laser will get 10,000 times brighter

Source: http://www.engadget.com/2016/04/04/slac-x-ray-laser-upgrade/

If you think that Stanford's use of a super-bright X-ray laser to study the atom-level world is impressive, you're in for a treat. The school and its partners have started work on an upgrade, LCLS-II (Linac Coherent Light Source II), whose second laser beam will typically be 10,000 times brighter and 8,000 times faster than the first -- up to a million pulses per second. The feat will require an extremely cold (-456F), niobium-based superconducting accelerator cavity that conducts electricity with zero losses. By contrast, the original laser shoots through room-temperature copper at a relatively pedestrian 120 pulses per second.
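Those headline figures are consistent with one another: 8,000 times the original 120-pulse-per-second rate works out to roughly the quoted million pulses per second. A quick sanity check (the numbers are the article's; the script is just arithmetic):

```python
# Pulse-rate sanity check using the figures quoted above.
lcls_rate = 120      # original LCLS: pulses per second through copper
speedup = 8_000      # quoted speedup for the LCLS-II beam

lcls2_rate = lcls_rate * speedup
print(lcls2_rate)    # 960,000 -- i.e. "up to a million pulses per second"
```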

The first X-ray laser isn't going away -- if anything, it'll be more useful than ever. The combination of the two beams will cover a wider energy range and help scientists study extremely small and extremely fast processes that either couldn't be recorded before or would take ages to examine in full. That, in turn, should lead to discoveries that advance electronics, energy and medicine. The big challenge is simply waiting for the upgrade, since it won't be ready until sometime in the early 2020s.

The LCLS-II accelerator upgrade

Source: SLAC National Accelerator Laboratory

Thursday, March 31, 2016

Google makes it easier to bring VR to your apps and the web

Source: http://www.engadget.com/2016/03/30/google-vr-view/

The challenge of bringing virtual reality to the masses isn't so much recording it as putting it in front of people's eyeballs. How do you plunk VR into an app without resorting to exotic code? Google can help. It's launching a VR View tool that makes it relatively easy to embed VR photos and videos in apps and websites. In apps, it takes just a few lines of code with the Cardboard developer kit (which now supports iOS, we'd add). On the web, you only need embed code like the sort you use for 2D clips.
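That web embed is just an iframe pointing at Google's hosted VR View player. The sketch below generates one; note that the player URL and the `image`/`is_stereo` parameter names are assumptions based on the VR View docs at launch, so check Google's current documentation before relying on them:

```python
# Sketch: build a VR View web embed (an iframe wrapping Google's hosted
# player). The player URL and query-parameter names are assumptions from
# the launch-era VR View docs -- verify against the current documentation.
from urllib.parse import urlencode

VRVIEW_PLAYER = "https://storage.googleapis.com/vrview/index.html"

def vr_view_embed(image_url, is_stereo=False, width=640, height=360):
    """Return an <iframe> snippet that embeds a 360-degree photo."""
    params = urlencode({"image": image_url,
                        "is_stereo": str(is_stereo).lower()})
    return (f'<iframe width="{width}" height="{height}" '
            f'src="{VRVIEW_PLAYER}?{params}" '
            f'frameborder="0" allowfullscreen></iframe>')

print(vr_view_embed("https://example.com/pano.jpg", is_stereo=True))
```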

It's a seemingly simple effort, but it could mean a lot for VR adoption. If it's trivial to add VR to apps and the web, you're more likely to see it used on a regular basis -- not just for the occasional experiment. You'll still need VR gear to make this more than a click-and-drag experience, of course, but it's still an important piece of the puzzle.

Via: TechCrunch

Source: Google Developers Blog

Friday, March 25, 2016

Amazon shows you how to make an Echo with Raspberry Pi

Source: http://www.engadget.com/2016/03/25/amazon-shows-you-how-to-make-an-echo-with-raspberry-pi/

If you're into messing with hardware and have some basic programming skills, you can put together an Amazon Alexa device of your very own. Amazon has even put together an official guide to do so on GitHub, Lifehacker reports. You'll need to snag a Raspberry Pi 2 and a USB microphone to make it happen, but you've probably got the other required hardware (a micro-SD card for storage, for example) lying around. Unfortunately, due to limitations with Amazon's Alexa Voice Service, your creation can't listen for a trigger word the way the Echo and Echo Dot can. Instead, you'll have to hit a button to issue commands. This isn't the first DIY Amazon Echo project, but it's notable since it comes officially from Amazon. The GitHub guide is also fairly detailed, so you can probably follow along even if you don't know what all the commands mean. It could be a fun project for anyone who wants to learn a bit more about hardware.
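Without a wake word, the whole interaction is push-to-talk: press the button, capture audio, send it to the Alexa Voice Service, play the reply. A minimal Python sketch of that loop, with the recorder, the AVS call and the playback stubbed out (the function names are illustrative stand-ins, not Amazon's actual API):

```python
# Push-to-talk loop sketch. record_audio, send_to_avs and play are
# illustrative stand-ins for real microphone capture, the Alexa Voice
# Service request and audio playback -- not Amazon's actual API.

def push_to_talk(record_audio, send_to_avs, play):
    """One button-triggered exchange: capture speech, get Alexa's reply."""
    clip = record_audio()        # capture while the button is held down
    reply = send_to_avs(clip)    # ship the clip to the Alexa Voice Service
    play(reply)                  # play back Alexa's spoken response
    return reply

# Dry run with stubs in place of real hardware and network calls:
reply = push_to_talk(
    record_audio=lambda: b"what time is it",
    send_to_avs=lambda clip: b"response-to:" + clip,
    play=lambda audio: None,
)
print(reply)
```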

Via: Hacker News

Source: Amazon (GitHub)

AI-written novel passes first round of a literary competition

Source: http://www.engadget.com/2016/03/24/ai-written-novel-passes-first-round-of-a-literary-competition/

Researchers from Future University Hakodate have announced that a short novel co-written by an artificial intelligence the team also developed was accepted by a Japanese story competition, the Hoshi Shinichi Literary Award. Though the story didn't ultimately win the competition, its acceptance does suggest that AI systems are quickly becoming capable of emulating human-like creativity.

The team, led by computer science professor Hitoshi Matsubara, collaborated closely with their digital construct during the writing process. The humans first assigned a gender to the protagonist and developed a rudimentary outline of the plot. They also assembled a list of words, phrases, and sentences to be included in the story. It was the AI's job to assemble these distinct assets into a unified text that wasn't just intelligible but compelling as well. The result was a novel entitled Konpyuta ga shosetsu wo kaku hi, or "The Day a Computer Writes a Novel", about an AI that abandons its responsibilities to humanity after recognizing its own talent for writing.
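That division of labor (humans supply the outline and the approved building blocks, the machine stitches them into prose) can be illustrated with a toy generator. This is purely a sketch of the assemble-from-fragments idea, not the team's actual system:

```python
# Toy fragment-assembly "writer": humans provide an ordered outline and a
# bank of approved phrases; the program picks one phrase per plot beat
# and joins them. Illustrates the division of labor only -- not the
# actual system built at Future University Hakodate.
import random

outline = ["opening", "turn", "ending"]          # plot beats, in order
phrase_bank = {
    "opening": ["It was a cold, rainy day.", "The lab hummed quietly."],
    "turn":    ["The computer began to write.", "A sentence appeared on screen."],
    "ending":  ["It wrote on, absorbed in its new talent.",
                "The humans were no longer needed."],
}

def assemble_story(outline, phrase_bank, rng):
    """Pick one phrase per beat and join them into a paragraph."""
    return " ".join(rng.choice(phrase_bank[beat]) for beat in outline)

print(assemble_story(outline, phrase_bank, random.Random(0)))
```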

This is the first year that the Hoshi Shinichi Literary Award has allowed submissions from machines. Of the 1,450 novels received for this year's competition, 11 were human/AI collaborations like Future U's. Interestingly, judges throughout the competition's four rounds are never told which stories were written by computers and which by humans. Though the team's story did make it past the first round, it was eventually eliminated because, as sci-fi novelist and award judge Satoshi Hase explained, it lacked sufficient character development despite being well-structured. Welp, there's always the X-Prize.

Via: Motherboard

Source: The Japan News

Thursday, March 17, 2016

Now you can ask Amazon's Echo about your Fitbit stats

Source: http://www.engadget.com/2016/03/17/now-you-can-ask-amazons-echo-about-your-fitbit-stats/

It was only a matter of time until someone integrated a fitness gadget with Amazon's Echo -- we should have guessed that Fitbit would be first. Starting today, you'll be able to ask any of Amazon's speakers about your Fitbit performance with a new Alexa skill. Once enabled, you can say "Alexa, ask Fitbit how I'm doing today" for a basic overview of your activity. Even more intriguing, you can ask Alexa things like how you've slept, or how much activity you've tracked, for any of the previous seven days.

Sure, it's pretty easy just to glance down at your Fitbit device, but the ability to ask about even more complex stats makes this a pretty compelling Alexa skill. Amazon's virtual assistant will even take on the role of a coach, offering encouraging and inspirational comments that take the time of day into account. Asking about your step count in the morning, for example, might get Alexa to say "you've got to start somewhere."
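At its core, a skill like this is an intent handler that turns tracker stats into a spoken reply, with the coaching line chosen by progress and time of day. A hypothetical sketch (the function and field names are invented for illustration; this is not Fitbit's actual skill code):

```python
# Hypothetical Alexa-skill-style handler: turn tracker stats into a
# spoken reply with a time-of-day-aware nudge. Function and field names
# are invented for illustration -- not Fitbit's actual skill.

def handle_daily_summary(stats, hour):
    """stats: dict with 'steps' and 'goal'; hour: 0-23 local time."""
    steps, goal = stats["steps"], stats["goal"]
    summary = f"You've taken {steps:,} steps toward your goal of {goal:,}."
    if steps >= goal:
        nudge = "Great work, you've hit your goal!"
    elif hour < 12:
        nudge = "You've got to start somewhere."   # morning encouragement
    else:
        nudge = f"Only {goal - steps:,} steps to go."
    return f"{summary} {nudge}"

print(handle_daily_summary({"steps": 1200, "goal": 10000}, hour=9))
```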

"As we look at how this integration could evolve in the future, there is an endless world of possibilities from fitness coaching and nutrition tips, to guidance before bedtime to help you get a more restful night's sleep," Tim Roberts, executive vice president of interactive at Fitbit, said in a statement.

I haven't had a chance to try out Fitbit's Alexa skill yet, but on paper it seems like the perfect use of Echo's voice smarts. It's much easier to ask about something like the number of calories you burned yesterday and get a quick reply, rather than open your phone, find the Fitbit app, and drill down to the appropriate screen. It's even more useful than the voice-powered food and activity logging that Fitbit brought to Microsoft's Cortana last year.
