SixthSense – a wearable gestural interface

In November I attended the annual Creativity World Forum conference in Oklahoma, which is part of the “Districts of Creativity”.

One of the keynote speakers was Pranav Mistry, a Research Assistant and PhD candidate at the MIT Media Lab. He talked about his work on the very interesting concept of “SixthSense”: a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information. I think the concept opens up many possibilities in the context of the Cross Media world.

The SixthSense prototype comprises a pocket projector, a mirror and a camera, coupled together in a pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user’s pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user’s hand gestures and physical objects using computer-vision-based techniques.

The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) worn at the tips of the user’s fingers, using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
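The core idea – segment each frame by the fiducials’ colors and take the centroid of each colored region as a fingertip position – can be sketched in a few lines. This is a hypothetical simplification, not Mistry’s actual implementation: a real system would work on live video (e.g. in HSV color space, with denoising), but the principle is the same.

```python
import numpy as np

def track_markers(frame, marker_colors, tol=30):
    """Return the centroid (x, y) of each colored fiducial in an RGB frame.

    frame: H x W x 3 uint8 array.
    marker_colors: dict mapping a marker name to its reference (r, g, b).
    A pixel belongs to a marker if all three channels are within `tol`
    of the reference color.
    """
    positions = {}
    for name, color in marker_colors.items():
        diff = np.abs(frame.astype(int) - np.array(color, dtype=int))
        mask = np.all(diff <= tol, axis=-1)        # pixels matching this color
        ys, xs = np.nonzero(mask)
        if len(xs) > 0:
            positions[name] = (xs.mean(), ys.mean())  # centroid = fingertip
    return positions

# Synthetic 100x100 frame with two fiducials (names are illustrative only).
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[28:33, 18:23] = (255, 0, 0)   # red marker, e.g. on the index finger
frame[58:63, 68:73] = (0, 255, 0)   # green marker, e.g. on the thumb

pos = track_markers(frame, {"index": (255, 0, 0), "thumb": (0, 255, 0)})
# pos -> {"index": (20.0, 30.0), "thumb": (70.0, 60.0)}
```

Tracking these centroids across frames yields trajectories; a gesture recognizer then maps the trajectories (pinch, frame, swipe) to commands for the projected interface. Adding more fingers or users is just a matter of adding more uniquely colored markers, which is why the approach scales to multi-touch and multi-user interaction.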

Read more about “SixthSense” here…