Future Lab: Context Aware

April 4th, 2011
 
 
Subscribe:
Connected Social Media - iTunes | Spotify | YouTube | Twitter | RSS Feed
Intel - iTunes | Spotify | RSS Feed
 

One of the next frontiers of computing is creating systems that understand the user. Context-aware devices of the near future might recommend restaurants, monitor a user's health, or screen phone calls, all based on information collected from device sensors and casual input data. Two of the most powerful sensors are already built into our devices: the microphone and the camera. In this episode of Future Lab, we talk with researchers who are exploring how to make devices context aware.

Interviewees:
Andrew Campbell, Professor of Computer Science, Dartmouth College
Rosalind Picard, Professor of Media Arts and Sciences, MIT
Lama Nachman, Senior Researcher, Intel Labs

Demos:
Mobile Sensing Group at Dartmouth

See photos on Flickr.

See also:
Raising the IQ on Smartphones

 
Posted in: Audio Podcast, Future Lab Radio, Intel, Intel Free Press, Intel Labs, Research@Intel