
SixthSense gesture interface: Real life is the new screen.

April 14, 2009 | 4 Replies

We’ve seen how, with the S60 application Handwave, users can take advantage of the front camera to convert hand gestures into basic navigation of phone controls.

[youtube=http://www.youtube.com/watch?v=xvh1eleBsZs]

What if you could turn all your hand gestures into something that your phone can recognise? Make a square with your thumbs and pointer fingers to frame and capture a picture? What about turning anything into a display (your hand, a blank sheet of paper, a potato), all controlled by your phone?


Worn around your neck, the latest device from the SixthSense project uses a camera to detect hand and finger gestures (much like Handwave, just a lot more complex) and a mini projector to display content. This means that anything you can shine the projector on becomes an interactive display, and your hands never have to physically touch the phone to tell it what you want to do.
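For the curious, here’s roughly how that camera trick can work. The SixthSense prototype reportedly tracks coloured marker caps worn on the fingertips; the sketch below is my own illustration of that idea in Python with OpenCV 4, not the actual SixthSense code. The HSV colour thresholds and the webcam standing in for the pendant camera are placeholders.

```python
# Minimal sketch of colour-marker fingertip tracking (assumes OpenCV 4).
# Coloured caps on the fingertips are segmented in HSV space and their
# centroids tracked frame to frame; gestures are built on top of that.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # ordinary webcam stands in for the pendant camera

# Placeholder HSV range for one coloured fingertip cap (roughly red tape)
LOWER = np.array([0, 120, 120])
UPPER = np.array([10, 255, 255])

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)  # keep only marker-coloured pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 100:       # ignore small noise blobs
            x, y, w, h = cv2.boundingRect(c)
            cx, cy = x + w // 2, y + h // 2  # fingertip position in the frame
            cv2.circle(frame, (cx, cy), 8, (0, 255, 0), 2)
            # Gesture logic would go here, e.g. checking whether four
            # tracked fingertips form the corners of a "photo frame".
    cv2.imshow("fingertips", frame)
    if cv2.waitKey(1) & 0xFF == 27:        # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Once you have fingertip positions every frame, the “frame a photo with your fingers” gesture is just geometry: four markers sitting roughly at the corners of a rectangle.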


That’s the essence of the SixthSense project: to create a device that instinctively knows “where the user is, what the user is doing, and who the user is interacting with,” says Dr Pattie Maes, head of the Fluid Interfaces Group at MIT, the creators of the latest SixthSense prototype.

Imagine going grocery shopping: you pick up an item and your phone projects a display onto it, telling you what it is, what it does and how much it costs. Imagine you’re in a bookstore: you might see reviews of that book, and price comparisons from other stores, all without ever having to tell your phone to search for the item, because it intuitively knows what it is and what you want to do with it.

It sounds like a great concept. Although the SixthSense team don’t think their platform will replace smartphones or laptops, the intuitiveness to anticipate what you want to do could do a lot to advance the growing trend of controlling our electronic devices with natural hand gestures.

The adaptable “be anywhere” screen is great, but it isn’t so useful for taking photos. If you’re zooming or adding effects, you can’t see what’s happening, because an image cannot be projected into the empty space between your hands.

What if they used some discreet glasses? Once everything is small enough, glasses could house the camera for the sensor, a screen for a heads-up display (Terminator style) of things only you want to see (texts/emails), and a projector for when you want to share what you can see with others. Heck, why not put it all in a contact lens? :p

The most important thing I’m taking away from all of this is how devices can “anticipate” what you want to do. No wading through menus, however intuitive they may be. The buttonless phone, once laughable, is now more than acceptable. Will we ever see phones without menus? Or at least some radical shift from the paradigm of hierarchical menus to something more fluid, with gestures that can instinctively work out what you’re looking for or want to do?

This BBC article contains a video demonstration of the latest prototype, with more details on further applications such as converting sign language gestures into speech, and gaming.


Category: Concept, Gestures, Video

About the Author

Hey, thanks for reading my post. My name is Jay and I'm a medical student at the University of Manchester. When I can, I blog here at mynokiablog.com and tweet now and again @jaymontano. We also have Twitter and Facebook accounts, @mynokiablog and Facebook.com/mynokiablog. Check out the tips, guides and rules for commenting >>click<< Contact us at tips(@)mynokiablog.com or email me directly at jay[at]mynokiablog.com