[Image: Black, brown, and white hands reaching toward a set of virtual content layers. Credit: Partnership on AI]

How AI Needs To Be Redesigned For People With Disabilities

Introduction

As a completely blind screen reader user, I find AI exciting. I’m not afraid of big data, and I’m optimistic that the changes coming to our world will bring far more benefits than harms.

When I think about AI, the things I imagine are dreams come true for many in the disabled community. Accurate image descriptions, meaning I’m never left out of anything online ever again. Improved text and image recognition, offering access to visual elements of the physical world in ways I’ve never been able to achieve before. Experiences customized exactly to my needs, in ways that were previously difficult or impossible. More accurate maps of both outdoor and indoor locations, offering step-by-step directions to get where I’m going.

Why talk about this now?

[Image: Woman in a wheelchair using a laptop and looking at the camera]

If you’re looking for a job, it’s possible that AI reads your resume and even analyzes your video interview. If you’re taking an online exam, AI is often used to detect whether you’re cheating, and may even decide your results. When you attend an online meeting, AI may be used to judge your attention level.

Almost every advertisement you see, every product recommended to you, and many of the movies and much of the music suggested to you are influenced, or entirely controlled, by AI.

Despite widespread adoption, this technology is still in its early days. So it’s critical that we begin working to solve the problems caused by AI and big data now, rather than later.

Even though AI is still in its early stages of development, many companies are already working to bring these ideas to reality. Apps like Microsoft’s Seeing AI and Envision can already recognize products, objects, and faces, as well as read text and barcodes in real time. Facebook uses AI to suggest automatic alt text for images, Google uses AI to describe images that are missing alt text, and Microsoft Office uses AI to describe images in Word and PowerPoint.

Of course, the holy grail for those of us who can’t drive is self-driving cars. The ability to go anywhere, at any time, without arranging a ride or depending on public transit, would be life-changing. And while self-driving cars aren’t quite a reality yet, companies like Waymo are already using AI to make cars more accessible.

If AI and big data are really going to bring all of us the future they promise, there are many critical challenges we still need to solve.

The problems

1. Recommendation systems do not consider the needs of people with disabilities

[Image: Blind man with a cane sitting in a park]

Netflix has spent many years, and many millions of dollars, creating a state-of-the-art system to recommend videos. It classifies everything on its platform into hundreds of categories based on genre, mood, and many other characteristics, and it uses the latest techniques in data analytics to recommend just the right video to each person at just the right time. However, if you rely on audio descriptions or closed captions to enjoy movies, many of these recommendations will be useless to you.

Why? Because none of Netflix’s algorithms seem to take those needs into account. Whether or not a movie or show includes audio description or closed captions doesn’t appear to be a datapoint the system uses at all. And there’s no way for users to tell Netflix that they only want content with audio descriptions.
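
To make the gap concrete, here is a minimal sketch of how accessibility metadata could act as a hard filter ahead of ranking. This is not Netflix’s actual system (its internals aren’t public); the Title and AccessibilityPrefs structures and their fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Title:
    name: str
    score: float                      # relevance score from the existing recommender
    has_audio_description: bool = False
    has_closed_captions: bool = False

@dataclass
class AccessibilityPrefs:
    require_audio_description: bool = False
    require_closed_captions: bool = False

def filter_accessible(candidates, prefs):
    """Drop titles the viewer cannot actually enjoy, before ranking them."""
    return [
        t for t in candidates
        if (t.has_audio_description or not prefs.require_audio_description)
        and (t.has_closed_captions or not prefs.require_closed_captions)
    ]

# A viewer who relies on audio description:
prefs = AccessibilityPrefs(require_audio_description=True)
catalog = [
    Title("Documentary A", 0.91, has_audio_description=True),
    Title("Thriller B", 0.97),        # higher score, but no audio description
]
for title in sorted(filter_accessible(catalog, prefs), key=lambda t: -t.score):
    print(title.name)                 # only "Documentary A" is recommended
```

The point of the sketch is that the accessibility check runs before relevance even matters: a title I can’t experience shouldn’t win a recommendation slot on score alone.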

Similarly, games recommended by Steam, or apps recommended by the Apple App Store, are equally useless to many people with disabilities. Sure, I might enjoy the things they recommend. But they’ve failed to consider the most important first step: are any of those things even accessible to me? Because of this lack of consideration, instead of making life easier or better, these recommendations just add irrelevant noise for me and for many other people with disabilities.

2. Advertising systems have no concept of accessibility

[Image: Blank billboard]

Personalized advertising has revolutionized the ad industry and the user experience. It promises to show people only the ads they might be interested in, and it promises businesses that their ads will reach only potential customers. To make this happen, enormous databases track what we click on, what we search for, and where we browse.

However, for people with disabilities, the interest profiles these systems build are likely to be far less accurate. Many ads for products I might be interested in purchasing are completely inaccessible, so they never get my clicks or my attention.

On the other hand, some ads for products that I find far less interesting are accessible, so I do focus on them, read them, and maybe even click on them. As a result, the “interest profile” built for me is heavily tilted toward products whose ads happen to be accessible to me, rather than products I might actually want to purchase.

Sadly, this makes advertising far less useful to me, and can even affect the deals that I’m offered when shopping online, all because the system hasn’t been designed to take my needs into account.
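
A tiny, entirely hypothetical simulation makes the feedback loop visible. The categories, probabilities, and logic below are invented for illustration and reflect no real ad platform:

```python
import random
from collections import Counter

random.seed(0)

# Invented ground truth: how interested this user really is in each category,
# and whether the ads in that category are accessible to a screen reader.
true_interest = {"electronics": 0.9, "travel": 0.8, "kitchenware": 0.3}
ad_is_accessible = {"electronics": False, "travel": False, "kitchenware": True}

clicks = Counter()
for _ in range(1000):
    category = random.choice(list(true_interest))
    # A click requires interest AND an ad the user can actually perceive.
    if ad_is_accessible[category] and random.random() < true_interest[category]:
        clicks[category] += 1

# The click-derived "interest profile" is dominated by the category the user
# cares about least, simply because its ads happened to be accessible.
print(clicks.most_common())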

3. AI-based video analyses do not account for people with disabilities

AI is being used to analyze videos of people in all sorts of contexts, for all sorts of reasons. One rapidly growing field where this technology is used is proctoring online exams. To make sure students aren’t cheating, AI is used to track students’ eyes and see whether they frequently look away from the screen. However, these systems fail to consider people who can’t see, and who never look at the screen at all. Similarly, AI fails to correctly track emotions and attention in videos of people with disabilities.
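
As a concrete illustration, consider a naive gaze-based rule of the kind these products are described as using. The function below is a hypothetical sketch, not any vendor’s actual algorithm:

```python
def flagged_for_cheating(gaze_on_screen, threshold=0.7):
    """Hypothetical proctoring rule: flag a student whose gaze is on the
    screen for less than `threshold` of the sampled video frames.

    gaze_on_screen is a list of booleans, one per sampled frame.
    """
    return sum(gaze_on_screen) / len(gaze_on_screen) < threshold

# A sighted student who glances away occasionally is not flagged:
print(flagged_for_cheating([True] * 9 + [False]))   # False
# A blind student who never faces the screen is flagged every time,
# no matter how honestly they take the exam:
print(flagged_for_cheating([False] * 10))           # True
```

Any rule built on the assumption that honest test-takers watch the screen will, by construction, flag every student who cannot see it.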

Another rapidly growing use of AI is in online job interviews. Using micro-expressions, eye tracking, and other physical indicators, companies claim that AI can judge someone’s emotions.