1. Recommendation systems do not consider the needs of people with disabilities
Netflix has spent many years, and many millions of dollars, creating a state-of-the-art system to recommend videos. It classifies everything on its platform into hundreds of different categories, based on genre, mood, and many other characteristics. It uses the latest techniques in data analytics to create algorithms that try to recommend just the right video to every person, at just the right time. However, if you are someone who relies on audio descriptions or closed captions to enjoy movies, many of these recommendations will be useless to you.
Why? Because none of Netflix’s algorithms seem to take those needs into account. Whether a movie or show includes audio description or closed captions doesn’t appear to be a data point the system uses at all. And there is no way for users to tell Netflix that they only want to watch content that has audio descriptions.
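To make the gap concrete, here is a minimal sketch of what the missing step might look like: filtering a catalog by accessibility metadata before ranking. Everything here is hypothetical, including the field names ("has_audio_description", the "score" values); Netflix's actual data model and ranking pipeline are not public.

```python
# Hypothetical sketch: apply the user's accessibility needs as a hard
# filter BEFORE ranking, so inaccessible titles are never recommended.

def filter_accessible(catalog, needs):
    """Keep only titles that satisfy every accessibility flag the user set."""
    return [title for title in catalog
            if all(title.get(flag, False) for flag in needs)]

catalog = [
    {"name": "Movie A", "score": 0.9, "has_audio_description": False},
    {"name": "Movie B", "score": 0.7, "has_audio_description": True},
]

user_needs = ["has_audio_description"]
recommendations = sorted(filter_accessible(catalog, user_needs),
                         key=lambda t: t["score"], reverse=True)
# Only "Movie B" survives, even though "Movie A" has a higher relevance score.
```

The point of the sketch is that the filter is trivial once the metadata exists; the failure described above is that the accessibility flags are apparently never collected or exposed in the first place.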
Similarly, games recommended by Steam, and apps recommended by the Apple App Store, are equally useless to people with disabilities. Sure, I might enjoy the things they recommend. But they’ve failed to consider the most important first step: are any of those things even accessible to me? Because of this lack of consideration, instead of making life easier or better, they just add irrelevant noise to my experience, and to that of many other people with disabilities.
2. Advertising systems have no concept of accessibility
Personalized advertising has transformed the ad industry and the user experience. It promises to show people only the ads they might be interested in, and it promises businesses that their ads will reach only potential customers. To make this happen, enormous databases track what we click on, what we search for, and where we browse.
However, for people with disabilities, the interest profile that these systems build is likely to be far less accurate. Unfortunately, many ads for products that I might be interested in purchasing are completely inaccessible, so they never get my clicks or my attention.
On the other hand, some ads for products that I find far less interesting are accessible, so I do focus on them, read them, and maybe even click on them. So the “interest profile” that’s been built for me is heavily tilted towards products whose ads are accessible to me, rather than products I might actually want to purchase.
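The skew described above can be shown with a toy model. This is purely illustrative, not how any real ad platform works: it assumes a profile built only from clicks, where a click requires both genuine interest and an accessible ad.

```python
# Toy illustration: a click-based "interest profile" only sees clicks,
# so ads that are inaccessible to a user never contribute to the profile,
# no matter how interested the user actually is.

def build_interest_profile(ad_impressions):
    """Count clicks per product category. A click happens only when the
    user is interested AND the ad is accessible to them."""
    profile = {}
    for ad in ad_impressions:
        if ad["interested"] and ad["accessible"]:
            profile[ad["category"]] = profile.get(ad["category"], 0) + 1
    return profile

impressions = [
    {"category": "cameras", "interested": True,  "accessible": False},  # no click
    {"category": "cameras", "interested": True,  "accessible": False},  # no click
    {"category": "socks",   "interested": False, "accessible": True},   # no click
    {"category": "books",   "interested": True,  "accessible": True},   # click
]

profile = build_interest_profile(impressions)
# profile records only "books": the genuine interest in cameras is invisible,
# because those ads were never accessible enough to click.
```

In other words, the profile ends up measuring which ads were accessible, not what the person actually wants, which is exactly the distortion described above.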
Sadly, this makes advertising far less useful to me, and can even affect the deals that I’m offered when shopping online, all because the system hasn’t been designed to take my needs into account.
3. AI-based video analyses do not account for people with disabilities
AI is being used to analyze videos of people, in all sorts of contexts, for all sorts of reasons. One quickly growing field where this technology is used is proctoring online exams. To make sure students aren’t cheating, AI is used to track students’ eyes, to see if they’re frequently looking away from the screen. However, these systems fail to consider people who can’t see, and who never look at the screen at all. Similarly, AI fails to correctly track emotions and attention in videos of people with disabilities.
Another rapidly growing use of AI is in online job interviews. Using micro-expressions, eye tracking, and other physical indicators, companies claim that AI can judge someone’s emotions.