The TED 2009 Conference is happening this week.
One talk, about MIT students who have developed an ultra-portable, wearable, and context-aware computer system, has received a write-up in Wired. The online article includes a couple of very exciting videos.
The computer is designed to respond to the wearer's hand gestures. By recognizing pointing gestures, it can identify the object you are paying attention to.
The computer can then automatically search the Internet for information about the item of interest. Once the information has been retrieved, it is displayed on the surface of that item.
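That pipeline, recognize a gesture, look the item up, then display the result on the item itself, can be sketched in a few lines. Everything below is a hypothetical toy illustration, not the MIT system's actual code or API: the gesture classifier is a crude geometric stand-in, and the "search" and "projection" steps are simulated with plain functions.

```python
# Toy sketch of the gesture -> lookup -> display loop described above.
# All names and data here are hypothetical illustrations.

def recognize_gesture(fingertips):
    """Classify a pointing gesture from tracked fingertip positions.

    A stand-in for a real computer-vision classifier: we call it a
    "point" if the index fingertip is well ahead of the others on x.
    """
    index_x = fingertips["index"][0]
    others = [pos[0] for name, pos in fingertips.items() if name != "index"]
    return "point" if all(index_x > x + 0.1 for x in others) else "unknown"

def lookup(item):
    """Stand-in for an Internet search about the targeted item."""
    fake_web = {
        "book": "Average rating: 4/5",
        "boarding pass": "Flight delayed 20 min",
    }
    return fake_web.get(item, "no information found")

def project_onto(item, info):
    """Stand-in for projecting the retrieved info onto the item's surface."""
    return f"[projected on {item}] {info}"

def sixth_sense_step(fingertips, targeted_item):
    """One cycle: recognize the gesture, fetch info, display it."""
    if recognize_gesture(fingertips) == "point":
        return project_onto(targeted_item, lookup(targeted_item))
    return None
```

For example, pointing at a book (index fingertip ahead of the rest) would trigger a lookup and return the string to be "projected" onto the book's cover.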
The computer interface design possibilities are endless.