Testing the usability of sound

The challenge with screen reader usability
Person using a screen reader to conduct tests via UserTesting

Accessibility is a marathon, not a sprint. - Natalie Russell, UserTesting Accessibility Lead

For Amber and me, what began as a simple question of “how do we design with sound for software?” sent us on an odyssey around the company, meeting several experts along the way. However, we landed in a much different place than where we began: with the discovery that screen readers are very, very challenging. 

What are screen readers? 

Screen readers, we discovered, are just one example of a tool built to enable and assist those who are sight impaired in accessing content. A screen reader uses synthesized speech (text-to-speech) to read aloud any text it finds for the user, including alt-text and text that isn’t visible on screen but is present in the HTML markup.
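To make this concrete, here’s a small, hypothetical HTML sketch (the markup and class name are our own illustration, not from any specific product) of the kinds of text a screen reader announces even when it isn’t visible on screen:

```html
<!-- The screen reader reads the alt text, not the pixels -->
<img src="panda-cubs.jpg" alt="A group of panda cubs sitting in the grass.">

<!-- Visually hidden text: not rendered on screen, but still read aloud.
     "visually-hidden" is a common CSS convention that positions the
     element off-screen rather than using display:none (which would
     hide it from screen readers as well). -->
<a href="/report.pdf">
  Download
  <span class="visually-hidden">the annual report as a PDF</span>
</a>
```

A sighted user sees only “Download”; a screen reader user hears the full link text, which is why this pattern is often used to give links meaningful context.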


How do screen readers work?

Amber and I first met with Scott Anderson, UserTesting Staff Frontend Engineer, who helped us see that sound can make—or break—a user’s experience of accessibility: used well, it becomes a means of fully utilizing digital products and removing barriers to inclusion. 

Scott demonstrated a screen reader so we could see and hear how it worked. The first thing that stood out to us was how dissimilar the experience was to how the content was designed to be consumed.

It wasn’t always clear when the screen reader switched to another paragraph or another page. And what about all the other items that appear on our screens that may not contain language but still signify meaning, such as icons, images, and diagrams? Also, a bigger question: what are the consequences when a company can’t guarantee that all of its customers have access to all content in a meaningful format? Doing nothing makes this question not only troubling as a matter of empathy but legally problematic as well. According to Gus Alexiou in Forbes, “Currently, there exists no overlay technology on the market that can, by itself, render a website fully accessible and protect it from ADA lawsuits.” 

As we uncovered more and more information about accessibility and sound, it became apparent that the process of creating and integrating assistive technology for making content accessible is in its infancy. 

While we focus on sound and design, it’s important to note that a physical disability can relate to movement, eyesight, or hearing and that the perspectives of no two individuals who have such impairments are the same. As such, that means catch-all solutions for assistive technology are nearly impossible to find, and users often employ a grab bag of technologies to assist in accessibility. Given the complex nature of the human body, there are no singular sound-accessibility solutions, either. 

However, if you’re thinking about your content (hint: you should) and the incorporation of assistive technology for accessibility, don’t be discouraged. It’ll take time to explore different solutions, but in so doing, you’ll learn so much about your diverse audience, and even better, you’ll ultimately design experiences that help improve and enhance people’s lives.

Sound and screen readers

Next, we wanted to learn more about how teams can develop content with these functions in mind. We sat down with Bryan Tiller, UserTesting Product Designer, who helped us contextualize the kinds of concerns a designer might grapple with when designing for sound and accessibility tools. We wanted to understand the why behind some standard accessibility content practices, such as providing alternate text with an image (and more of these are listed below), and how that translated into real-life experiences for users. 

Bryan noted that when creating content that assists hearing- or sight-impaired individuals, you need to present visual information in a way that doesn’t rely on sight and audio information in a way that doesn’t rely on hearing. 

This can feel counter-intuitive and become a major design challenge. However, this is nothing compared to how challenging and frustrating it would be not having access to the content at all. 

For content or UX writing teams—or the inventors of the technology—it might seem that imagination is your biggest tool, but really, we learned that it’s empathy. Take the time to test how accessibility tools like screen readers function. When you witness how a screen reader works and hear how it moves around the page, that insight will inform your content with empathy for those who use these tools.

While there’s a lot of room for progress in this arena, Bryan had some helpful considerations for the content or technology itself (such as screen readers and other voice-powered technologies, keyboard assists, eye-gaze activation tools, etc.) when designing with sound for accessibility:  

  • Alt-text: Have any images been described in alt-text (alternative text)? This differs from an image caption in that the text needs to describe what a user finds in the image, such as actions, people, animals, etc. Be descriptive but brief and add clarity to the image or task. Descriptions that are too long may be cut off when read by screen readers. For example: “A group of panda cubs sitting in the grass.” 
  • Images: Are non-informational, aesthetic images identified and marked in a way that they’ll be ignored by assistive technologies?
  • Captions: Do prerecorded videos have captions? Captions help those with hearing impairments know what’s been said; however, they’re not automatically read by screen readers, so they won’t help individuals relying on screen readers for context.
  • Transcripts: Does prerecorded audio-only and video-only content have a transcript available? Transcripts can be read by screen readers and give valuable context for users with sight impairments.    
  • Video descriptions: Audio descriptions provide aural context for what is seen on screen for those with visual impairments. A narrator describes relevant visual information aloud, such as actions, on-screen text, and imagery, in the pauses between dialogue, along with additional context such as music descriptions when present.
  • Audio quality and differentiation: There are many factors guiding audio in someone’s space, such as ambient noise in a room, earplugs, a computer, or apps. What will help your audio stand out? 
  • Have you ever seen a screen reader in action? Check one out when you have a chance to do so. It takes time for the voice-assistive technology to read everything out loud. Can you help cut down on that time?
  • Are your sounds soothing for those with sound sensitivities? (Twitter explored this recently, describing how they arrived at their new chirps in a blog post).
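A few of the checks above can be automated. As a rough sketch (our own illustration, using only Python’s standard library—not a tool mentioned by Bryan), here’s a small script that flags images with no alt attribute and notes images explicitly marked as decorative with an empty alt:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags with no alt attribute (screen readers may fall
    back to announcing the file name) and notes decorative images
    explicitly marked alt="" so assistive technology skips them."""

    def __init__(self):
        super().__init__()
        self.missing = []      # images with no alt attribute at all
        self.decorative = []   # images explicitly marked alt=""

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "(unknown)")
        if "alt" not in attrs:
            self.missing.append(src)
        elif attrs["alt"] == "":
            self.decorative.append(src)

checker = AltTextChecker()
checker.feed("""
<img src="pandas.jpg" alt="A group of panda cubs sitting in the grass.">
<img src="divider.png" alt="">
<img src="chart.png">
""")
print(checker.missing)     # images a screen reader can't describe
print(checker.decorative)  # images assistive technology will skip
```

A check like this only catches the mechanical part of the problem: it can tell you an alt attribute is absent, but only a human can judge whether the description is brief, accurate, and actually helpful.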

Bryan also mentioned that there are unique situations in which a user who doesn’t have a disability requiring a screen reader still utilizes one. For example, someone with adequate vision might use one to assist with reading comprehension. 

An advanced screen reader app can even override system preferences and circumvent the usual UI. How, then, to balance developing content with all of these considerations in mind? As Bryan told us, “When someone begins to work with a screen reader, it is remarkable how quickly the brain adapts to assistive technologies, and even multiple assistive technologies.” 

UX research for sound and accessibility will be “learn by doing”

This is where conducting UX research will make a difference in understanding use cases and applications—and where Natalie Russell, Accessibility Lead at UserTesting, helped remind us that having already done UX research prepares us with an empathetic mindset. 

Testing sound and assistive technology for accessibility experiences may feel overwhelming, but it doesn’t have to be. Like all customer research, researching sound and accessibility is a “learn by doing” process, as Natalie notes. This won’t be a skill you acquire overnight. Many companies with good intentions can get it wrong, rushing a product to market before it has been tested thoroughly. 

In the above example involving Twitter, their design team remained thoughtful about their users and set design criteria to:

  • Promote accessibility for those with sensory sensitivities
  • Blend the organic and digital
  • Keep sounds short and smooth
  • Attract attention without being distracting

Still, knowing that you can’t please all of the people all of the time, the team also made sure to provide the ability to turn off sound effects. 

When conducting testing for sound and accessibility, it’s imperative to include proper representation through all phases of development. Twitter’s design team concluded that, “...designing new, more accessible sounds requires disability representation throughout the design process.”

Additionally, Natalie noted that testing with those who might use your product or consume your content,

...is the only way to step across the empathy gap and ask someone directly ‘What are your needs?’ 

Testing technology builds empathy 

Something Bryan said really stuck with us: he noted that “Testing technology is worth it because it builds empathy.” Even imagining what we might change or adapt and testing to see whether those changes help is a great exercise in empathy. 

Empathy is the foundation of UX research, and this process necessitates human insight. As Alexiou said in Forbes, commenting on the application of AI to accessibility compliance, “What is beyond speculation is…AI can no more make a website compliant by itself than a robotaxi can drive you across New York City, or your Amazon Alexa can be a witty participant at your next dinner party.” In other words, it will take more than just technology to improve accessibility; it will mean using human insights to drive that technology—not machines responding to other machines. UX’s purpose is simple and clear, which is to connect with our users. 

Or, in Bryan’s words, “We want all our users to be able to have access to information, regardless of whether or not they can see or hear clearly.” 

For our last leg of the journey, I sat in on a meeting for those interested in upholding accessibility in UX research at UserTesting, led by Accessibility Lead Natalie Russell. Natalie’s expertise really shone as she discussed the ways we could become more helpful and conscious of our customers and colleagues around accessibility. 

Natalie ended the meeting by noting, “It’s about ‘experience over compliance,’” which struck me as the most profoundly empathetic perspective I’d heard thus far. What Natalie taught me at that moment is that we can set up all of the rules and compliance standards we like—but if our motivation is to satisfy compliance over an investment in fairness and access for all, then we’ve missed the point. 

Our work is not just about satisfying the legalities of access, it’s about providing a meaningful, positive experience no matter who uses our product. Whatever your challenges with sound and content, sound and technology, and so on—that meaningful and positive experience for all is a commitment we must keep for the enrichment of our customers and the betterment of ourselves.

About the authors


Rachel Blackburn

Rachel is a writer, educator, and researcher who loves a good qualitative study. She brings her creativity and collaborative spirit with her always and is currently based in Atlanta, GA. You can find her on LinkedIn.

Amber Beercroft

Program Manager of Scaled Programs by day, aspiring blogger by night. Based in Atlanta, GA. 
