Artificial intelligence resource hub

Explore insights, research, and expert voices on how to keep AI product momentum strong, without compromising inclusivity.


The speed-usability trade-off

Companies everywhere are racing to integrate AI into their products, believing it’ll make user experiences smarter, faster, and more personalized. But AI doesn’t follow instructions: it learns from data and patterns that reflect the perspectives and limitations of the people who create them.

Without validation from people with disabilities, AI-powered products can amplify bias instead of improving experiences. In the rush to be first to market, companies risk leaving both people and profits behind.

Equitable AI starts with inclusion

Dr. Jutta Treviranus, a leader in inclusive design, believes that by involving people with disabilities in every stage of the AI ecosystem, from design to education, we can build technology that’s fair, transparent, and works for everyone.

Source: WebAble TV. Accessible and Equitable Artificial Intelligence, Jutta Treviranus, PhD.

Featured content

Why accessibility might be AI’s biggest breakthrough

Browse UK study findings that challenge assumptions about who benefits most from AI tools.

How AI must be redesigned for people with disabilities

Explore the critical challenges we need to solve if AI is really going to bring the future it promises.

Insiders speak on AI and accessibility

See how people with disabilities are thinking about AI in products and the potential of inclusive AI.

AI and disability: Coded futures

Scan research from Open Inclusion to understand why bias is hard to remove once it’s embedded in AI systems.

How AI excludes people

AI reflects the data it’s trained on. When people with disabilities aren’t represented in that data, bias gets built in. Accessibility becomes an exception instead of an expectation — and the result is technology that doesn’t serve everyone equally. Think…

  • Voice recognition that misses unique speech patterns
  • Eye-tracking that assumes every user can see
  • Recommendation engines that surface inaccessible content

Products validated by people with disabilities don’t just work better; they earn trust faster across user segments. Features that work well for people with disabilities reduce friction and complexity for all. AI can unlock these opportunities at scale if it’s trained and validated on the full spectrum of human experience.

Dig a little deeper

AI language models show bias against people with disabilities (Penn State, 2024)

Disability, Bias, and AI (AI Now Institute, 2019)

Removing Bias from AI with Dr. Jutta Treviranus (Remarkable Insights podcast, 2024)

Untested perceptions put progress at risk

Three common assumptions about AI and accessibility can create risk where teams expect reward.

Perception 1: We’ll fall behind if we don’t move fast with AI.

Reality: Slowing down a little helps you move faster in the end.

AI product roadmaps often hinge on assumptions. Human validation ensures those assumptions are tested against real-world experience, reducing wasted cycles and catching usability gaps before launch.

  • De-risk innovation by validating ideas early with real users
  • Cut costly remediation efforts (according to Forrester, fixing usability post-launch is up to 100x more expensive)
  • Protect against costly public backlash, litigation, or PR crises
  • Accelerate with evidence-based confidence

“For every hour that a UX designer invests into accessibility pre-launch, we save up to four hours in engineering post-launch not spent bug-fixing accessibility issues.”

Dirk Ginader
Accessibility UX Engineering Lead, Google

Perception 2: Designing for the “average” user will be most profitable.

Reality: People with disabilities represent a trillion-dollar consumer market.

When you consider people with disabilities, you tap into a highly loyal group of customers. If you get your product wrong, rebuilding trust is a huge lift.

Over 1 billion people live with disabilities around the globe. And it’s one of the fastest-growing user segments due to aging populations.

This group represents $1 trillion USD in spending power. Validating products with people with disabilities unlocks new market share and builds trust faster.

“If an online shopping experience isn’t accessible, and I can get the item from another brand, I’m 100% abandoning the experience. I’m trying something else.”


Caitlin W.
Blind screen reader user and Fable Community member

Perception 3: We use automated tools to check for accessibility. We’re covered!

Reality: Being compliant doesn’t make your product usable.

A product can meet every WCAG rule and still be inaccessible to people using assistive technology.
Your AI product can be compliant and still be the subject of a human rights complaint. The EU AI Act and other global accessibility mandates are raising the bar.

It’s estimated that nearly 5,000 website accessibility lawsuits will be filed in the US in 2025. Validation with people with disabilities is quickly becoming a compliance requirement.

“Automated checks will spot some issues, but to truly understand the customer’s experience, we run inclusive usability sessions with Fable. Those tests flag not only technical violations, but also blockers like confusing language or too many steps.”


Kasia Pawluk
UX Researcher, B&Q

Fable bridges the gap between “fast” and “right”

The AI “ship fast, fix later” mentality is doing companies a disservice. Teams that take a thoughtful approach to embedding continuous, real-world human validation today will own the future of AI-enabled products tomorrow.

Fable has the experience and tools to get you there.

  • A diverse Community of people with disabilities who use assistive technology and are standing by to test your products
  • Experience working at all stages of product development, from generative and prototyping to beta and QA
  • Custom inclusive dataset creation for training and testing AI models
  • A free Accessible Usability Scale (AUS) tool to benchmark the usability of your AI-powered products
  • Office hours with our experts to help interpret research findings

Only 7% of the Fable Community believe they’re adequately represented in AI development, yet 87% say they’d be willing to provide feedback to product developers.

That gap is where bias grows. It’s also where your opportunity lies. Reach out to our team to get started.

Build AI tools that work for everyone

Fable connects you with people with disabilities to build and validate inclusive AI products.
