While watching this we invite you to imagine Gabriel White’s alluring and smart Australian accent explaining the language and affordances of gestural UI (slides 17-21). When SXSW releases the audio we’ll link it up here. Until then, here’s an overview to accompany the slides.

Where’s the opportunity? Sensors are already in place to support gestures across a variety of devices and use cases, but there may be a temptation to add them simply because you can. We gently remind designers: just because you can doesn’t mean you should. The opportunity (and the challenge) for designers is to write the rules of gestural UI, and Gabe puts forward several guiding questions to help designers ensure gestures add meaning, and possibly the right level of fun, to the device experience.

One key example is shaking. The Sansa Shaker music player (slide 40) needs no screen; the device’s shape invites users to shake it. The mental model of shaking the device to randomly rearrange songs in a playlist comes readily. UrbanSpoon’s iPhone app (slide 43) extends the metaphor appropriately, letting users randomize the search results of local restaurants. Then the examples get more tenuous. A gesture like shaking can quickly degrade in value, and even detract from an experience, as it gets loosely translated. Applications like Facebook for iPhone (slide 44) let you shake to refresh your friend feed, which doesn’t fit the mental model of randomizing or shuffling. Then mCoolPhone (slide 46) fully breaks the model by letting you shake to answer a voice call.
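Under the hood, all of these shake interactions start from the same signal: accelerometer readings crossing a motion threshold. As a rough illustration only (this is not the code from any of the apps above, and the threshold and jolt-count values are invented for the sketch), a naive shake detector might look like:

```python
import math

# Hypothetical tuning values; real apps calibrate these empirically.
SHAKE_THRESHOLD = 2.5  # acceleration magnitude (in g) that counts as a "jolt"
MIN_JOLTS = 3          # jolts required to call the motion a shake

def is_shake(samples, threshold=SHAKE_THRESHOLD, min_jolts=MIN_JOLTS):
    """samples: a sequence of (x, y, z) accelerometer readings in g.

    Counts readings whose magnitude exceeds the threshold and reports
    a shake when enough of them occur.
    """
    jolts = sum(
        1
        for (x, y, z) in samples
        if math.sqrt(x * x + y * y + z * z) > threshold
    )
    return jolts >= min_jolts

# A device at rest reads roughly 1 g (gravity only) and should not trigger;
# vigorous back-and-forth movement should.
still = [(0.0, 0.0, 1.0)] * 10
shaken = [(3.0, 0.5, 1.0), (-2.8, 0.2, 1.0), (2.9, -0.4, 1.0), (0.0, 0.0, 1.0)]
```

The design questions in the talk apply even at this level: the threshold decides whether a gentle bump counts as intent, which is exactly the kind of rule designers, not just engineers, should be writing.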

Guiding Questions for Gestural UI Design:
1- What’s the metaphor?
2- What are the affordances?
3- Is the application specialist or generalist?
4- How can I use gesture to disambiguate?
