April 10, 2009

Design Considerations for Touch UI

Posted in: Observations

Designing for touch-based mobile user interfaces
requires new thinking and an expanded design vocabulary.

Carriers and handset manufacturers are increasingly adding touchscreen devices at the premium end of their device line-ups. Our challenge is not just designing for touch, but designing mobile UI frameworks built for the range of devices in an OEM’s or carrier’s product line. This means designing for consistency across 5-way key and full-screen touch UIs. The following insights are drawn from Punchcut’s interaction design, visual design and motion design disciplines, each of which contributes to creating satisfying touch UI experiences.

1// Design for immediate access

Touch screens allow users to jump from point A to point B with a single tap, which influences the way we design interactions and screen layouts. The user doesn’t have to step through menu items the way traditional key-based mobile UIs require, so the top-most items on a screen need not be the most important or most frequently accessed features.

Recognizing that the user can move much more quickly through the interface, it is essential to streamline the UI and make core navigation very clear. Design with quick taps in mind, and use finger travel as a mechanism to guide the user. For example, repeated taps in the same region can guide the user down a specific primary flow, while finger travel across the screen serves as a subtle cue that they are leaving that flow. Touch keyboard input should be a last resort.

2// Keep gestures smart and simple

The touch experience is one of direct manipulation — something everyone has experience with. Our physical world is based on direct manipulation, and users naturally bring that mindset to a touch experience. It is therefore important, when implementing gestural controls, to make them simple and intuitive. In other words, the foundation of the UI should respond exactly as a user would expect, making taps and flicks essential ingredients. Additional gestures beyond flicks and taps can certainly be used; just recognize that they may not be naturally discoverable. Provide a redundant button and make the gesture a shortcut to the same functionality. These additional gestures require explicit instruction so users can take advantage of them.

It’s important to distinguish between global, system-level gestures and local, app-level gestures. In many ways, it is the global gestures that are necessary to keep the UI intuitive and straightforward. Once inside an application context, you may educate users about unique gestures that add to the touch vocabulary while in the application, so long as the gestures do not negate or confuse the global gestures.
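To make the tap/flick foundation concrete, here is a minimal sketch of how a completed touch might be classified from its start and end points. The distance and velocity thresholds are illustrative assumptions, not values from any particular platform.

```python
# Sketch: classifying a completed touch as a tap, flick, or drag.
# Threshold values below are illustrative assumptions.
import math

TAP_MAX_DISTANCE_PX = 10     # finger barely moved: treat as a tap
FLICK_MIN_VELOCITY = 0.5     # pixels per millisecond

def classify_touch(x0, y0, t0, x1, y1, t1):
    """Return 'tap', 'flick', or 'drag' for a touch from (x0, y0) at
    time t0 (ms) to (x1, y1) at time t1 (ms)."""
    distance = math.hypot(x1 - x0, y1 - y0)
    duration = max(t1 - t0, 1)           # avoid division by zero
    velocity = distance / duration
    if distance <= TAP_MAX_DISTANCE_PX:
        return "tap"
    if velocity >= FLICK_MIN_VELOCITY:
        return "flick"
    return "drag"
```

A global gesture recognizer of this shape would run system-wide, with application contexts layering their own gestures on top rather than redefining these basics.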

3// Leverage clear mental models

The touch experience is an intimate interaction with the content and UI space. There is an opportunity to transport the user into an interface world that is governed by common rules of physical motion like inertia, bounce and gravity that build and reinforce expectations when the user touches, flicks, or drags interface elements. Dimensionality and/or a sense of physicality may help offset the experience of interacting with the flat aspect of the screen, when feedback may be minimal.

Transition animations used throughout the device experience help confirm that an action has taken place, and may give users a greater sense that they have gone “deeper” into an application context, or shifted over to a parallel task. When used, transitions should recede — they shouldn’t call dramatic attention to themselves. Simple and quick motions keep the user focused on the task at hand, rather than loud or long special effects that move them from point A to point B.
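A simple way to keep transitions quick and unobtrusive is a short ease-out curve: fast at the start, settling gently. The duration below is an illustrative assumption.

```python
# Sketch: progress of a short ease-out cubic transition.
# The 200 ms duration is an illustrative assumption.
DURATION_MS = 200

def ease_out(elapsed_ms):
    """Return transition progress from 0.0 to 1.0 at elapsed_ms."""
    t = min(elapsed_ms / DURATION_MS, 1.0)
    return 1 - (1 - t) ** 3              # ease-out cubic
```

A curve like this covers most of the distance early, so the motion reads as responsive rather than as a special effect the user must wait out.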

There are no focus states or hover states to cue users in, so iconography and other touchable elements should stand out clearly from page content. The notable exception is T-Mobile’s G1, which combines a trackball with touch input. In that instance a hover state can be mistaken for an active state as users flip between mental models, moving from trackball to touch input and back.

Further, a hardware key used for Home, Back, or Menu may be an effective model to let the user feel they are leaving the current on-screen context to access another.

4// Design for real hand sizes

Fingers come in all sizes, so ensure that the interface is designed for real people. This has an impact on both the sizing of interface elements and the objects that surround them. For example, when designing for a QVGA resolution, 45-48 pixel hit targets are ideal for the average finger. Tolerances between buttons are driven by the size of the button and the likelihood of accidentally hitting an adjacent element. Generally speaking, the smaller the buttons, the bigger the gaps needed between them. There is usually some tolerance flexibility at the edge of the screen, because the finger is only partially on screen there, making the target easier to hit. Many in-dash automobile navigation systems successfully use this method.

The physical size of a hit target depends on the screen’s pixel density, so on-device testing becomes critical to measure the actual size of the resulting interface. A common goal for touchable elements is a 10 millimeter minimum, regardless of resolution.
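The 10 mm minimum can be turned into a pixel count for a given screen density. As a sketch (the example density is an assumption), note that at roughly 120 dpi — plausible for a QVGA-era handset — the result lands in the 45-48 pixel range cited above.

```python
# Sketch: converting a 10 mm minimum hit target into pixels for a
# given pixel density. Example densities are assumptions.
MM_PER_INCH = 25.4
MIN_TARGET_MM = 10.0

def min_target_px(dpi):
    """Pixels needed for a 10 mm hit target at the given density (dpi)."""
    return round(MIN_TARGET_MM * dpi / MM_PER_INCH)
```

On a denser screen the same physical target needs proportionally more pixels, which is exactly why a pixel value alone cannot be the specification.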

5// Touch feedback is key

Users are mobile, and will often be using their devices under compromised and distracted conditions. Without clear feedback, the user must focus more attentively on the task on screen, which may directly conflict with the real-world task they are simultaneously trying to accomplish. Not all touch is the same; technology plays a key role. The responsiveness of the UI, and whether the screen uses pressure-based (resistive) or capacitive touch, will influence the level of feedback needed in the interface. Visual, audible and/or tactile feedback allows users to keep their attention on what’s most important.

The finger is a blunt instrument, often obscuring the very target a user is touching. Creating a visual feedback system that takes this into account is essential, as visual feedback is the one type that is truly necessary in a touch environment. Feedback that remains visible after the user has released their touch can be equally useful in guiding them through an interface.
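One common response to the occluding finger is to draw the feedback indicator offset above the touch point, where it stays visible. A minimal sketch, with the offset value as an illustrative assumption:

```python
# Sketch: positioning visual touch feedback above the finger so the
# indicator is not hidden by it. The offset is an assumption.
FEEDBACK_OFFSET_PX = 60   # draw the indicator this far above the touch

def feedback_position(touch_x, touch_y, screen_top=0):
    """Return (x, y) for a feedback indicator, clamped to the screen."""
    y = touch_y - FEEDBACK_OFFSET_PX
    return touch_x, max(y, screen_top)
```

Magnified key popups on touch keyboards follow the same principle: the confirmation appears where the finger is not.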

Audible feedback is often used as a secondary mechanism when visual feedback might not be noticed. However one must proceed with caution when adding audible feedback to mobile devices, because these are often turned off by the user. When creating audible cues, specific ranges of sounds are recommended to cut through the din of the user’s natural environment.

Haptic feedback can offset the difficulty some users face in transitioning from a key-based device to a flat touchscreen.

The haptic response can give the user the illusion that a key has been pressed by producing a small vibration under the finger. While vibration technology can provide a spectrum of tactile experiences for the UI designer, recognize that the user may opt to turn off this feedback to conserve battery life.

Interested in talking with us about Touch UI? Contact Punchcut.

