Wednesday, July 11, 2007

MIT: The iPhone's Untapped Potential

Core77 picks up on a June MIT Technology Review article describing some of the unrealized potential in the iPhone.

"Turns out that, in addition to having the interface to kill all portable interfaces, it is tricked out with a number of just slightly utilized sensors; specifically an accelerometer, an ambient light meter, and an IR motion sensor.

While Apple has applied these to the admirable goal of rotating your screen and adjusting your brightness for you, some other smart people have already been busy using them for more creative ends. Like learning about human nature."
Start with the iPhone, Work Back to Human Nature
Posted by: Carl Alviani on Tuesday, July 10 2007

Now, take a step back: Accelerometers are motion detectors--they get used to help measure distance walked (pedometers) and the intensity of car crashes (impact meters), among other things. Some creative designers have figured out how to make them fun (Nintendo Wii). It's not a huge stretch to combine this sort of data with light, motion and sound sensing to start getting a picture of what a user is doing all day, moment to moment. Standing, sitting, and walking have recognizable signatures, and from there it's a short computational step to recognizing when a user is cooking, working, hanging out, shopping, etc. It's like a diary, but honest. It's like Twitter, but less irritating.
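For the curious, here's a rough idea of what that "short computational step" could look like -- a toy Python sketch of my own, not anything from the article or from Apple. It buckets a window of accelerometer readings by how far the total acceleration strays from gravity; the window size and thresholds are invented for illustration, and a real system would learn them from labeled data.

    # Toy illustration only: guess a coarse activity from one window of
    # accelerometer samples, each an (x, y, z) reading in g's.
    # The thresholds are made-up numbers, not measured ones.
    import math

    def activity_from_window(samples, gravity=1.0):
        # How far each sample's magnitude strays from 1 g at rest
        motion = [abs(math.sqrt(x*x + y*y + z*z) - gravity) for (x, y, z) in samples]
        mean_motion = sum(motion) / len(motion)

        if mean_motion < 0.05:   # barely moving: sitting, standing, phone on a desk
            return "still"
        elif mean_motion < 0.5:  # moderate, rhythmic motion: walking
            return "walking"
        else:                    # big spikes: running, a crash test, a Wii swing
            return "vigorous"

    # A phone lying flat on a desk reads roughly (0, 0, 1 g):
    print(activity_from_window([(0.0, 0.0, 1.0)] * 50))  # -> "still"

Stack enough of those guesses end to end, fold in the light and proximity readings, and the honest diary starts to write itself.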

Now, take another step back: Once again, MIT researchers are way ahead of us. Here's a study group called Reality Mining that's been gathering data in this manner from study participants since 2004, combining it with data on proximity sensing between users, and analyzing the hell out of it. Findings are ongoing, but what's already there is massively intriguing. Social networking in the real world has a statistical signature, and measurable patterns called Eigenbehaviors start emerging. It's still mostly in the realm of statisticians and analysts, but the trajectory points insistently toward a new and powerful tool for designers.
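"Eigenbehaviors" sounds exotic, but the mechanics are plain old principal component analysis: write each day as a vector (say, one entry per hour flagging where you were or what you were doing), stack the days into a matrix, and the top components are the recurring routines that most of your days are mixtures of. Here's a bare-bones sketch with random filler data standing in for the real Reality Mining logs -- the point is the math, not the numbers.

    # Bare-bones "eigenbehavior" sketch: PCA over day-vectors.
    # The data here is random filler, not Reality Mining's.
    import numpy as np

    np.random.seed(0)
    # 60 days x 24 hours; each cell is a 1/0 flag such as "at work this hour"
    days = (np.random.rand(60, 24) > 0.5).astype(float)

    mean_day = days.mean(axis=0)
    centered = days - mean_day

    # Principal components of the day-vectors are the eigenbehaviors
    _, singular_values, eigenbehaviors = np.linalg.svd(centered, full_matrices=False)

    explained = singular_values**2 / np.sum(singular_values**2)
    print("Top 3 patterns explain %.0f%% of day-to-day variation"
          % (100 * explained[:3].sum()))

    # Any single day is then (approximately) the mean day plus a weighted
    # mix of a few eigenbehaviors -- which is why a person's routine turns
    # out to be so compressible, and so predictable.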

Potential applications are significant for... well, who aren't they significant for? Consumer electronics designers looking for new interface methods; medical and fitness product designers looking for better ways to get information from users to devices; design researchers who want higher-quality data from a less intrusive method: pay attention. Things are changing.