A first reaction might be: if users can’t see the screen, how can they know where to touch? It might seem impossible to design touch-driven interfaces for vision-impaired users, and if it’s impossible, then there’s no point in trying. Wrong: it is not impossible, and it is definitely worth the effort to make touchscreen designs accessible, particularly since touch is the interaction modality for all modern mobile devices.

Not many designers, developers, and UX professionals have had the occasion to dive in and learn about how people who are blind or have low vision use touchscreens. To be honest, I hadn’t until I attended a conference on accessible technologies.

That first day, in a hotel conference room, I was struck by the number of people attentively tapping, typing, and swiping on their touchscreen phones and tablets, with the screens turned off. They wore headphones as they listened to screen readers speak the text on their screens. Before the conference, I had tried out my phone’s built-in screen reader a few times, mostly as a way to have long articles read aloud to me. But I’d never taken the plunge to use the web or applications with the screen off.

iPhone accessibility settings menu
The iPhone’s built-in screen reader, VoiceOver, is available from the Accessibility menu in the phone’s General settings.

As a sighted person unaccustomed to using a screen reader, relying only on spoken words to interact with my phone quickly became exhausting. I was impatient that I couldn’t quickly glance down to see what else was on the screen. I had to wait for the reader to announce something interesting, or slide my finger across the screen and hope to hear the keywords I wanted. At one point, I accidentally activated my browser’s settings menu and couldn’t, for the life of me, figure out what had happened. Having the screen off also tested my recall ability, especially when typing. Normally, I type fast and can even type without constantly looking at the keys. But on-screen keyboards offer no haptic feedback, so typing when you can’t see the keys at all is much more difficult. For example, imagine you can’t see the screen and you want to search for an extension cord online. You’re in the search box and you’re aiming for the letter X, but the screen reader tells you that your finger has landed on C. Do you move your finger left or right? Trick question: the correct answer is that you give up and use the dictation tool, because it will take you forever to type the phrase “extension cord”.

Mobile Devices Are Convenient for Everyone

People who are blind or have low vision rely on touchscreen mobile devices for the same reasons sighted people do: portability and on-the-go convenience. Texting, sending quick emails, making phone calls, looking up tomorrow’s weather, catching up on the latest news headlines, setting reminders and alerts for ourselves — we all appreciate the quick access to these when we’re on the go. People with low vision are no exception. Moreover, some applications designed specifically for people who have visual impairments help them identify colors, currency, labels, and even objects.

Screenshots of TapTapSee app
TapTapSee is an application that helps recognize objects in the real world. Users take a picture of an object with their device’s camera, and the app identifies what the object is, speaking the description aloud. In the screenshots above, it correctly identified a US $5 bill (left), and even specifically identified my “MacBook Pro and gray wireless Apple keyboard on brown wood table” (right). Pretty impressive.

Screen Readers and Gestures on Touchscreen Devices

For people who have low vision, using a touchscreen typically consists of listening to text read aloud by a screen reader and interacting with the elements on screen via a lexicon of multi-finger gestures.

Screen readers are software programs that identify the elements displayed on a screen and repeat that information back to the user, via text-to-speech or braille output devices. While sighted people visually scan a page, people who have visual impairments use screen readers to identify text, links, images, headings, navigation elements, page regions, and so on.
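As a simple illustration (the markup and the announcements in the comments are made up for this example, and exact phrasing varies by screen reader and version), a page like the snippet below would be spoken element by element, each with its role:

```html
<h1>Today's News</h1>                       <!-- "Today's News, heading level 1" -->
<a href="/weather">Tomorrow's weather</a>   <!-- "Tomorrow's weather, link" -->
<button>Search</button>                     <!-- "Search, button" -->
<img src="logo.png" alt="Acme Corp logo">   <!-- "Acme Corp logo, image" -->
```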

Listening to a screen reader requires considerable focus and significantly increases the user’s cognitive load. You have to pay attention as the text is spoken, so that you can figure out what’s on the page, what is interesting to you, and whether or not an element is actionable. Unlike a visual page, which can be scanned in any order, a screen reader presents information in strict sequential order: users must patiently listen to the description of the page until they come across something that interests them; they cannot directly select the most promising element without first attending to the elements that precede it. However, some amount of direct access is available. If users expect news headlines to be in the middle of the page, they can place a finger in that general area of the screen to have the screen reader skip the page elements preceding that position, saving the time it would take to listen to the entire page. If users expect the shopping cart to be in the upper right corner, they can touch that part of the screen directly.

If you miss something, you can’t glance back. Instead, you flick one finger across the screen until you hear what you missed. Listening to a page being read aloud requires that users hold a lot of information in their short-term memory. Consider the task of listening to a waiter recite a long list of specials: you have to pay attention and remember all of them as you decide your choice of entrée for the evening. The same happens when you’re listening to a screen reader, only on a larger scale.

Modern mobile devices come with built-in screen readers: on Android devices, the text-to-speech program is called TalkBack, and on Apple iOS devices, it is VoiceOver. Below is a video demo of VoiceOver on an iPhone running iOS 8.3.

(You can try it yourself. Turn on your device’s screen reader in the Accessibility section. For iPhones, go to Settings > General > Accessibility > VoiceOver. For Android, go to Settings > Accessibility. Give it a try for a few hours. Good luck. It takes a while to get the hang of it, but it will come.)

Browsing on touchscreen devices involves a range of gestures, many of which offer far more functionality than the tap and swipe gestures of the sighted world. To give you a better idea, here is a sample of some of the most common gestures for VoiceOver:

  • Drag one finger over the screen to explore the interface and hear the screen reader speak what’s under your finger.
  • Flick two fingers down the screen to hear it read the page from the top down.
  • Tap once to bring a button or link into focus (so you know what it is); double-tap to activate the control.
  • Flick three fingers horizontally for the equivalent of a regular swipe.
  • Flick three fingers vertically to scroll the screen up or down.

As you can see, the vocabulary of gestures that users with low vision have to learn is quite large. We know that gestures have low discoverability and learnability, yet for power users these gestures are the only way to navigate efficiently through a system that is largely based on sequential access.

Implications for Design

A lack of visual information is taxing on the user experience, because people cannot quickly glance around a page, scan a list of menu choices, or aim with certainty at a target. They cannot use visual cues to detect page hierarchy, groupings, relationships between content, or even the tone of an image. They must discern this information from the text spoken by the screen reader. Add to that all the gestures they have to learn in order to interact with a website or application. It’s a lot of information to keep track of, and we haven’t even mentioned the demands of understanding the content itself.

Designers and developers need to keep users with low vision in mind when creating interfaces and applications. (Of course, they should strive for inclusive design, which considers people with all types of impairments, including cognitive and motor impairments. But for this article, we’re focusing on vision impairments.) Below are a few suggestions.

Remember that usability issues for sighted users are amplified for users who are nonsighted.

If the user experience is even remotely challenging for sighted users, it is going to be much more challenging for people who are blind. Very often, improving the interface for sighted people goes a long way toward improving the site for people with visual impairments.

  • Focus on cutting out extraneous copy and functionality that add little benefit to users. Less text on a page means there are fewer words for sighted users to scan, and less text for screen-reader users to listen to.
  • Reduce interaction cost where possible by reworking workflows so that people can accomplish tasks more efficiently.

Invest in writing cleaner, more accessible code.

Remember, users with vision impairments may not be able to perceive visual cues such as grouping and color schemes to determine relationships between elements or to assess hierarchy. Instead, they rely on cues from the code being read: heading levels (H1, H2, etc.) convey main content sections. Text that ends with the identifier “link” or “button” tells users that an item is actionable.
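As a minimal sketch of these cues (the element names are illustrative, and the announcements in the comments are approximate), semantic markup carries this information automatically, while generic markup does not:

```html
<!-- Semantic elements give screen readers structural cues for free. -->
<h1>Your Cart</h1>                         <!-- "Your Cart, heading level 1" -->
<h2>Shipping Options</h2>                  <!-- "Shipping Options, heading level 2" -->

<!-- A native control is announced as actionable... -->
<button type="submit">Check out</button>   <!-- "Check out, button" -->

<!-- ...while a styled div is announced as plain text, giving users
     no cue that it can be tapped. -->
<div class="checkout-btn">Check out</div>
```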

  • Incorporate alternative text for images, describing anything that isn’t apparent from the text on the page (see the sketch after this list).
  • Make pages easy to navigate with the keyboard alone, and you’ll address some of the most common usability issues for people using screen readers on mobile devices.
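Here is a rough sketch of both suggestions; the file names, link target, and class name are illustrative:

```html
<!-- A "skip" link lets keyboard and screen-reader users jump past
     repeated navigation; it is typically the first focusable element. -->
<a href="#main" class="skip-link">Skip to main content</a>

<!-- Describe what the image conveys; use an empty alt attribute for
     purely decorative images so that screen readers skip them. -->
<img src="sale-banner.png" alt="Spring sale: 20% off all extension cords">
<img src="divider.png" alt="">

<main id="main">
  <!-- Native links and buttons are keyboard-focusable by default;
       avoid positive tabindex values, which scramble the focus order. -->
  <a href="/deals">Today's deals</a>
</main>
```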

Avoid creating complex gestures for the sake of being unique.

With so many gestures already in use by screen readers, sites that introduce additional gestures (or hijack existing ones) cause serious usability problems.

Conclusion

Designers, developers, and usability professionals need to understand what the experience of using touchscreen devices is like for people who are blind or have low vision. Beyond that, we need to remember that compliance with accessibility guidelines is not the end goal: usability is.

For more tips on how to create usable, accessible designs, see our free report: Usability Guidelines for Accessible Web Design.