Person with hearing aid using a mobile device

Why You Need to Do User Testing with Deaf and Hard of Hearing People


Meryl K. Evans, Director of Marketing, Equal Entry. She argues for why you need to conduct user testing with deaf and hard of hearing people, describes why their needs are often overlooked, and explains how to test captions and transcripts for maximum accessibility.

One of the most powerful and effective ways to ensure your digital content is accessible and inclusive is by involving people with disabilities. However, one group is often overlooked. And that’s the deaf and hard of hearing (HH).

Development teams think the deaf/HH can navigate websites and digital products with little trouble. And that all they need is captions and transcripts for any audio.

Thus, development teams do their own testing with captions and transcripts. They figure turning off the sound is enough to mimic the deaf/HH experience.

It’s not.

And besides, there’s more to testing captions than checking accuracy.

How to Test Captions

While many people who aren’t deaf/HH use captions, they don’t depend on captions like deaf/HH people do. It’s a wholly different experience for those who need the captions. Unlike hearing people, they have no fail-safe backup of turning on the sound.

Testing captions isn’t as simple as you might think. Here are 10 factors to pay attention to when checking the quality of captioned videos.

1. Readability

If the video contains closed captions, then you don’t have to worry much about readability. They default to the platform’s standard caption format. And depending on the platform, viewers can customize the captions to their preference.

Open captions are different. These are captions that always show on the video; you cannot turn them off and on. The captions are essentially an image burned into the video, and development and QA cannot reliably judge whether open captions are readable.

Sometimes you need to use open captions for a video. This is especially true of mobile social networks like Instagram and TikTok. If you upload from the Facebook, Twitter, or LinkedIn mobile app, then you’ll need to use open captions; these networks only accept caption files when you upload from a desktop, laptop, or another computer.

Unfortunately, many of the open caption styles offered in mobile apps are not accessible. They violate many of the caption quality factors that follow. It’s not just contrast you need to consider. If the captions are ALL CAPS or animated, they create a poor captioning experience.
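Where you control the player, such as a video on your own website, closed captions are attached as a separate caption file that viewers can toggle and style. Here is a minimal sketch of that setup in TypeScript; the element ID and the caption file name are hypothetical.

const video = document.querySelector<HTMLVideoElement>("#promo");

if (video) {
  const track = document.createElement("track");
  track.kind = "captions";       // closed captions the viewer can turn off and on
  track.label = "English";
  track.srclang = "en";
  track.src = "captions.en.vtt"; // hypothetical WebVTT caption file
  track.default = true;          // show the captions by default
  video.appendChild(track);
}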

2. Accuracy

Deaf/HH people watch captioned videos and review transcripts every day. Hours and hours of it. By comparison, development and QA only spend a few minutes watching a video to check the captions for accuracy. They can do that, but they have to be careful not to let their hearing fill in the blanks for missing captions.

3. Synchronization

Hearing developers and testers can also check synchronization. But it’s very important to pay attention to the audio and the captions. They must be in sync. With the sound off, it’s harder for hearing people to catch out-of-sync captions. That’s because they often don’t depend on lipreading for listening like some deaf/HH do.

Yes, deaf/HH people can tell when a video is out of sync. Even with no sound. It could be the captions don’t match the action on the screen. Or they don’t match the lip movements.

4. Length

The next big factor in high-quality captions is length. This is another one where many companies falter. The lines will be too wide or too short. Or they’ll show three or more lines of captions. The ideal length is one or two lines with no more than 32 characters per line.

Long lines of captions convert the captioning experience from scanning to reading. The most effective captions are scannable. They allow people to view the video without being hung up on the captions. But when the captions are long, it forces people to read and miss the action in the video.

Another problem with length is bad breaking points or line division. The breaking point is the last word of each caption line. When not done right, it causes cognitive overload and confusion.

It’s one of those things deaf/HH will notice and hearing people won’t. Captioning Key’s line breaks section is the best resource on this. Check out the examples on the page and you can see what a difference it makes.

Here are examples of bad breaking points: 

ending. Starting another sentence
in one line. 

Splitting names like the author is Meryl
Evans. 

Meryl Evans should stay together. It’d be better to do this: 

Splitting names like the author
is Meryl Evans. 

Here’s another bad break, this one ending a line on the conjunction “and”:

She ate an orange, banana, and
apple for breakfast.

Avoid ending lines after a conjunction; the following works better.

She ate an orange, banana, 
and apple for breakfast. 

And then there are captions that leave you hanging by ending on words like “to,” “and,” or “for,” where you have to wait for the next caption to complete the thought. Getting little things like this right minimizes cognitive overload.
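Part of the length rule can be checked automatically. Here is a hedged sketch that flags cues exceeding two lines or 32 characters per line; it assumes the captions have already been parsed out of an SRT or WebVTT file into plain text, and the Cue shape and flagLongCues name are my own. It can’t judge breaking points, which is exactly the part that benefits from deaf/HH reviewers.

// Flag caption cues that break the "two lines, 32 characters per line" guideline.
interface Cue {
  start: string; // cue start time, e.g. "00:00:05.000"
  text: string;  // cue text with lines separated by "\n"
}

function flagLongCues(cues: Cue[], maxLines = 2, maxChars = 32): Cue[] {
  return cues.filter((cue) => {
    const lines = cue.text.split("\n");
    return lines.length > maxLines || lines.some((line) => line.length > maxChars);
  });
}

// Example usage with a made-up cue:
const flagged = flagLongCues([
  { start: "00:00:05.000", text: "She ate an orange, banana, and apple for breakfast this morning." },
]);
flagged.forEach((cue) => console.log(`Check the cue at ${cue.start}: ${cue.text}`));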

5. Position

The next factor in quality captions is position. This is the easiest one. Put the captions on the bottom. You can move the captions up temporarily when on-screen text appears at the bottom. Just be sure to bring them back down once the text clears. I’ve conducted many polls on position and 99 percent choose the bottom.

When I watch videos with the captions at the top the whole time, I miss a lot more of the action on the screen. Some explain this by saying the captions on the bottom put them closer to people’s faces.  

However, the camera pans out for many scenes where you can see the whole person or maybe no one. There’s no scientific study, but many of us agree that we can watch more of the video with captions on the bottom than on the top. Captions belong on the bottom with occasional exceptions. 
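If you author the caption file yourself, that temporary move can be written into WebVTT with the line cue setting. Here is a minimal sketch; the timings and wording are made up for illustration.

// One cue moves near the top of the video while on-screen text sits at the bottom;
// the next cue returns to the default position at the bottom.
const repositionedCues = `WEBVTT

00:00:12.000 --> 00:00:15.000 line:1
These captions sit near the top while on-screen text shows below.

00:00:15.000 --> 00:00:18.000
Back at the bottom once the on-screen text clears.`;
// Save this text as a .vtt file and attach it with a track element.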

6. Sound

Sound, or rather the lack of captions for it, is a real problem that frequently comes up in international films. These films contain captions for audio that’s not in the viewer’s language. And that’s it. If you’re watching a Japanese film, anytime someone speaks Japanese, the captions will show the English version.

But if they switch to English, there will be no captions. If there is music, song lyrics, or important sounds, those won’t be captioned either. This is one of those things you can’t test by turning off the sound. And when you live with hearing, you become so accustomed to sounds that you don’t think about them being missing for the deaf/HH.

7. Credits

The next factor is also easy to catch. Whenever text appears on the screen, the captions should not overlap the text. Viewers want to see both the captions and the onscreen text or credits. Simple as that. 

8. Voice Changes

A voice changes for a reason. A person speaking hoarsely may be sick, losing their voice, or something else. Sometimes a person imitates something, someone, or an animal. Those changes need to be noted in the captions. When a person suddenly speaks softly or loudly, that needs to be mentioned too.

The deaf/HH can catch confusing points in voice changes and speaker identification. Like the time I saw a performer singing nursery rhymes on a show. I couldn’t figure out why people applauded his singing. The next time he was on the show, the captions revealed the reason his singing impressed the audience. He imitated famous rappers. 

9. Speaker Identification

Hearing people can often identify a speaker because they recognize the person’s voice. That’s not the case for the deaf/HH. I was watching a scene. I replayed it and replayed it. I needed to know who spoke a specific line. There was no telling from watching people’s lips as it was a fast conversation with four people. Speaker identification matters.
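When you write the caption file yourself, WebVTT has built-in support for speaker labels. Here is a minimal sketch with invented names, timings, and wording; the voice tags identify who is speaking, and the bracketed lines capture the kinds of sounds and voice changes described above.

// <v Name> voice tags label each speaker; bracketed descriptions cover
// non-speech sounds and voice changes.
const speakerCues = `WEBVTT

00:01:02.000 --> 00:01:05.000
<v Meryl>Who said that line?</v>

00:01:05.000 --> 00:01:08.000
<v Alan>[imitating a famous rapper] I did.</v>

00:01:08.000 --> 00:01:10.000
[audience applauds]`;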

10. Flow / Movement

And finally, flow or movement is mainly a problem with open captions and live captions. Some of the mobile apps provide caption options with moving or scrolling captions. The only time captions can get away with scrolling or rollup is when it’s a live show.  

Moving captions are a problem for people with vestibular disorders, migraines, and reading disabilities. Pop in captions create a better experience because they allow the viewers to read at their own pace. This is where one or two lines of captions pop in and then pop out. 

When I attended a captioned virtual reality presentation, I ran into a problem few hearing people will notice. I didn’t feel so great because reading the captions was like watching a ping pong match. No matter what I tried to do to minimize the head movement, it didn’t work. 

Why You Need Transcripts 

Captioned videos aren’t the only thing that deaf/HH need. Transcripts for videos and podcasts are also important. Whenever possible, you want to offer both captions and transcripts. Some deaf people prefer captions. Some prefer transcripts. And people who use refreshable Braille displays or screen readers need transcripts; those tools don’t work with captions.

And too many transcripts are not readable. They contain large blocks of text with few, if any, paragraph breaks. This is not scannable. This is not readable. It causes cognitive overload, and no one wants to read it.

Ask the development and QA teams to read the transcripts without playing the video or podcast. It’ll help them recognize how painful it is to read a poorly formatted transcript. 

Adding Non-Sound Alerts 

Apps love to tell us to turn on the notifications so they can alert us with updates. Anything to keep us using the app as much as possible. Thus, alerts are another important feature to incorporate into your digital content and products. 

Many apps and websites use sounds to notify us that there’s an update, change in status, or a new message. It helps to provide at least two non-sound notification options. Non-sound alerts could be a popup, a badge, text that stands out, or haptics, which create a sense of touch through vibrations or motion.
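As a hedged sketch of pairing two non-sound options, the snippet below shows a visible badge plus a vibration when a new message arrives. The element ID and function name are hypothetical; navigator.vibrate is a standard web API, but not every device supports it, so it’s treated as optional.

// Pair a visual alert with haptics so no notification relies on sound alone.
function notifyNewMessage(unreadCount: number): void {
  // Visual alert: an on-screen badge with the unread count.
  const badge = document.querySelector<HTMLElement>("#unread-badge");
  if (badge) {
    badge.textContent = String(unreadCount);
    badge.hidden = false;
  }

  // Haptic alert: a short vibration pattern where the hardware supports it.
  if ("vibrate" in navigator) {
    navigator.vibrate([100, 50, 100]);
  }
}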

Some people prefer haptics to visual alerts, while others prefer visual alerts to haptics. Or they may have different preferences depending on the alert type. Developers and QA can’t test for these by turning off the sound; they already know where to look.

Deaf/HH testers, on the other hand, won’t know what to expect. And they’re the best ones to tell you whether they can find the alert settings they want and whether they notice the alerts.

What Deaf/HH Will Notice That Others Won’t 

Digital products and websites don’t exist in silos. They’re part of a company’s processes that also include real-world scenarios. For example, a customer needs to contact a company. They go to the company’s website for contact options.

Deaf/HH customers will notice a website only offers a phone number as a contact option. Or maybe the process asks customers for a callback number. Always offer at least two input options and two modern contact options. Modern means not using a fax number or a mailing address as a contact option. 

The digital process may work perfectly fine for the deaf/HH customer. But there could be a problem with another part of the process, like the time I went through the drive-through for COVID-19 testing. That was a disastrous experience.

Another customer wanted the drive-through, but the staff demanded she go inside the building. Well, she has mobility issues, so she decided to find another business that did drive-through testing. This same process failed a blind person because he couldn’t sign up for testing without someone’s help. One process put up barriers for three people with different disabilities.

Deaf/HH testers can catch other problems with an app or website. Maybe the app only accepts speech as an input method. Some deaf people have an accent, and some don’t speak at all. Speech-to-text is a real problem and it’s becoming more pervasive.  

Don’t Rely on One Deaf/HH Person 

Like autism, deaf/HH is a spectrum. When you’ve met one deaf person, you’ve met one deaf person. We’re all different. When hiring a sign language interpreter, you’ll want to work with deaf people who prefer sign language to verify that anything having to do with the interpreter passes muster.

If you interview only one deaf person, then your product will most likely fail. One deaf person declared he was speaking for all of us when he said we don’t like haptics. A few of us deaf folks chimed in saying that we like haptics. This doesn’t mean you have to interview many deaf/HH people to cover the spectrum. Ensure you have a variety when interviewing five or six customers.

It’s time for companies to start including deaf/HH users and customers in their testing process. Better yet, hire them and ask for their input throughout the product life cycle. Don’t wait until you’ve created a minimum viable product. It’s faster and cheaper to do it right the first time than to retrofit accessibility.