Sunday, February 21, 2016

Samsung has a 360-degree camera for Gear VR video

Source: http://www.engadget.com/2016/02/21/samsung-has-a-360-degree-camera-for-gear-vr-video/

Along with the expected Galaxy S7 and Galaxy S7 Edge, Samsung is also taking the wraps off a 360-degree video camera, the Gear 360. It's built around two 15-megapixel sensors, each nestled behind a fisheye lens, and it also has a tiny 0.5-inch display. You can use it handheld with the included handle, or set it down on a surface with a mini-tripod. In the latter mode, it looks kind of like a Portal turret had a baby with an old Logitech webcam. In the best way possible. Both the handle and the tripod screw into an industry-standard threaded mount (the same one that probably graces the underside of your camera), so you can always bring your own accessories to the party.

Although it's difficult to rate the resolution of 360-degree video, Samsung says it'll capture 3,840 x 1,920 video at 30 frames per second. That's the full width of 4K UHD, though 240 vertical pixels short of its 2,160. Still images are far larger: 7,776 x 3,888, or roughly 30 megapixels. There's no on-board storage, but it supports microSD cards up to 128GB in capacity.
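Those published figures are easy to sanity-check; the resolutions below come from the article, while the 4K UHD comparison point (3,840 x 2,160) is the standard spec:

```python
# Pixel counts from Samsung's published Gear 360 resolutions.
video_px = 3840 * 1920       # pixels per video frame
still_px = 7776 * 3888       # pixels per still image
uhd_shortfall = 2160 - 1920  # vertical pixels short of 4K UHD

print(video_px)              # 7372800 (~7.4 MP per frame)
print(still_px / 1e6)        # ~30.2 megapixels
print(uhd_shortfall)         # 240
```

So the stills really are about four times the per-frame pixel count of the video.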

While the Gear 360 will output plain MP4s or JPEGs, it's really designed to let anyone create video for the company's Gear VR headset. You can sync the camera with a Galaxy S7 or S7 Edge for remote-control features, and you'll also be able to preview footage in real time on your phone screen. Videos shot on the Gear 360 can be viewed, stitched, and saved directly on a smartphone.

We got to play with the Gear 360 very briefly at a meeting in New York, and while the thing may look like a video game tchotchke, its ease of use is its biggest asset. Popping in the battery and a memory card (just in case) took seconds, and so did pressing the button on top of the sphere to start it all up. After that, you're more or less meant to forget about it — Samsung's aiming to capture more meaningful slices of life, ones that wouldn't normally be shot by professional VR rigs, so off-the-cuff usage is encouraged. We even managed to get a short 360 video loaded onto a Galaxy S7 Edge for a bit of auto-stitching — which is faster than it sounds — and wound up with a perfectly serviceable slice of VR. We don't have a price or an exact release date for the Gear 360 yet, but it'll be available in "select countries" at some point between April 1st and June 30th.

Read More...

Lenovo adds more mid- and low-end options to laptop range

Source: http://www.engadget.com/2016/02/21/lenovo-yoga-flex-miix-mwc-announcements/

Lenovo has a bunch of new Windows 10 machines to show off at MWC this year, and if you know the company's Yoga and Miix lines, they'll look very familiar.

First up is the ultraportable Yoga 710, which comes in 11- and 14-inch sizes. Both have 1080p IPS touchscreens, up to 8GB of RAM and up to 256GB of SSD storage. The smaller model offers a choice of Intel Core m processors (up to a Core m5) with integrated Intel graphics, while the larger uses 6th-generation Intel Core i processors (up to a Core i7) and up to Nvidia GeForce 940MX graphics. Like all Yogas, the 710's hinge rotates a full 360 degrees, giving you a choice of laptop, stand, tent, or tablet modes. The 11-inch model starts at $499, while the 14-inch will cost $799. They'll both go on sale this May.

The Flex 4 (called the Yoga 510 internationally) will be available in 14- and 15-inch configurations. It keeps the general Yoga aesthetic, the 1080p displays, and the same up-to options (Core i7, 8GB of RAM, 256GB SSD), but its graphics top out at an AMD Radeon R7 M460 GPU. It's scheduled for release this April at $599 for the 14-inch, or $699 for the 15-inch.

Lenovo's Miix 310 tablet.

For the super budget-conscious, Lenovo has the Miix 310, a $229 convertible powered by an Intel Atom x5-Z8300 CPU with integrated graphics. It has a 10.1-inch "up to 1080p" display, "up to" 4GB of RAM, and "up to" 128GB of eMMC storage. There'll also be a model with LTE support, but Lenovo is quiet on the price for that configuration. We suspect the exact pricing will become clearer closer to its release date in June.

Read More...

Saturday, February 13, 2016

13 jobs that are quickly disappearing thanks to robots

Source: http://www.businessinsider.com/jobs-that-are-quickly-disappearing-thanks-to-robots-2016-2

Robotic arms. (Christopher Furlong / Getty Images)

Thanks in part to automated mail sorting systems, postal workers may be all but obsolete in the not-so-distant future.

By 2024, the US Bureau of Labor Statistics projects a 28% decline in postal-service jobs, totaling around 136,000 fewer positions than in 2014.
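Taken together, the article's two figures imply the approximate size of the 2014 postal workforce (this back-of-the-envelope number is our inference, not a figure stated in the article):

```python
# A 28% decline equal to ~136,000 positions implies the 2014 baseline.
decline_fraction = 0.28
positions_lost = 136_000

implied_2014_jobs = positions_lost / decline_fraction
print(round(implied_2014_jobs))  # 485714, i.e. roughly 486,000 jobs
```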

Mail carriers and processors aren't the only ones whose jobs are disappearing thanks to robots.

Automation technologies that conduct physical, intellectual, or customer service tasks are affecting a variety of fields, most notably metal and plastic machine workers.

Based on the BLS's occupational outlook data, here are 13 jobs that could be on their way out of the US thanks to robots:


13. Forging machine setters, operators, and tenders (metal and plastic)

According to the BLS, they set up, operate, or tend forging machines to taper, shape, or form metal or plastic parts.

Median annual pay: $35,480

Number of people who held this job in the US in 2014: 21,600 

Predicted number of people who will hold this job in 2024: 17,000

Projected decline: 21.5%

Why it's declining: According to the BLS, one of the most important factors influencing employment of manual machine setters, operators, and tenders is the high adoption of labor-saving machinery like computer numerically controlled (CNC) machine tools and robots to improve quality and lower production costs. 



12. Grinding, lapping, polishing, and buffing machine tool setters, operators, and tenders (metal and plastic)

According to the BLS, they set up, operate, or tend grinding and related tools that remove excess material or burrs from surfaces, sharpen edges or corners, or buff, hone, or polish metal or plastic work pieces.

Median annual pay: $34,150

Number of people who held this job in the US in 2014: 71,400

Predicted number of people who will hold this job in 2024: 55,800

Projected decline: 21.9%

Why it's declining: According to the BLS, one of the most important factors influencing employment of manual machine setters, operators, and tenders is the high adoption of labor-saving machinery like computer numerically controlled (CNC) machine tools and robots to improve quality and lower production costs. 



11. Patternmakers (metal and plastic)

According to the BLS, they lay out, machine, fit, and assemble castings and parts to metal or plastic foundry patterns, core boxes, or match plates.

Median annual pay: $41,670

Number of people who held this job in the US in 2014: 3,800

Predicted number of people who will hold this job in 2024: 2,900

Projected decline: 23.4%

Why it's declining: According to the BLS, one of the most important factors influencing employment of manual machine setters, operators, and tenders is the high adoption of labor-saving machinery like computer numerically controlled (CNC) machine tools and robots to improve quality and lower production costs. 
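Each "projected decline" above follows directly from the two employment counts. Recomputing from the rounded headcounts in this list lands within a few tenths of a point of the published percentages (the BLS presumably rounds from unrounded data):

```python
def projected_decline(jobs_2014, jobs_2024):
    """Percent decline implied by two employment counts."""
    return (jobs_2014 - jobs_2024) / jobs_2014 * 100

print(round(projected_decline(21_600, 17_000), 1))  # forging: 21.3
print(round(projected_decline(71_400, 55_800), 1))  # grinding: 21.8
print(round(projected_decline(3_800, 2_900), 1))    # patternmakers: 23.7
```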



See the rest of the story at Business Insider

Read More...

Friday, February 12, 2016

This New App Turns Your Phone Into a Portable Seismic Station

Source: http://gizmodo.com/this-new-app-turns-your-phone-into-a-portable-seismic-s-1758712546

Whoa, did you feel that earthquake? Even if you didn’t, your phone did, and a new app from seismologists aims to capture those vibrations in your very own pocket seismology lab.

Read More...

UCLA open sources image detector that can see what we can't

Source: http://www.engadget.com/2016/02/12/ucla-open-sources-image-detector-that-can-see-what-we-cant/

UCLA has released the source code to powerful image detection software that can see an object's every detail at high speed -- key for applications like fingerprint and iris scanning, or self-driving cars. It starts by identifying an object's edges and then looks for and extracts its other, fainter features. For instance, if there are items with textured surfaces in the image, the algorithm can recognize and enhance them. It can even see through bright lights to detect the structure of their sources, such as lamps, LED lights and even the moon.

The Phase Stretch Transform algorithm was developed by UCLA professor Bahram Jalali, senior researcher Mohammad Asghari and their team. The project is a spin-off of the university's research on photonic time stretch, which can be used to detect cancer cells. It's also the secret behind what UCLA once called the "world's fastest camera," capable of recording events that happen very, very fast. The algorithm's code is now publicly available on GitHub and MATLAB Central.
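The core idea is a nonlinear, frequency-dependent phase operation applied in the Fourier domain: edges accumulate the most phase, so they can be read off from the phase of the transformed image. Here's a minimal illustrative sketch in Python -- the kernel shape, parameter names and defaults below are our assumptions for demonstration, not UCLA's released implementation:

```python
import numpy as np

def pst_edges(img, warp=12.0, strength=0.5, thresh=0.01):
    """Sketch of a phase-stretch-style edge detector (illustrative only)."""
    rows, cols = img.shape
    u = np.fft.fftfreq(rows)[:, None]   # vertical spatial frequencies
    v = np.fft.fftfreq(cols)[None, :]   # horizontal spatial frequencies
    r = np.sqrt(u**2 + v**2)            # radial frequency

    # Warped phase kernel: grows with frequency, so high-frequency
    # content (edges, texture) accumulates the most phase.
    rw = r * warp
    kernel = rw * np.arctan(rw) - 0.5 * np.log1p(rw**2)
    kernel = strength * (kernel / kernel.max())

    # Apply the phase kernel in the frequency domain, then read the
    # phase of the spatial-domain result.
    spectrum = np.fft.fft2(img.astype(float))
    transformed = np.fft.ifft2(spectrum * np.exp(-1j * kernel))
    phase = np.angle(transformed)

    return np.abs(phase) > thresh       # simple binary edge map
```

A flat image yields an empty edge map, while an image with an intensity step lights up near the discontinuity; the real implementation adds smoothing and morphological cleanup on top of this core step.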

Source: UCLA

Read More...