
Bias by Design: Why AI Personalization Often Excludes People with Disabilities—And How Retailers Can Fix It

By Brandi Thompson

AI is helping retailers deliver more tailored shopping experiences, but without careful design, it can unintentionally exclude people with disabilities by hiding critical accessibility features. This post explores how bias enters personalization systems and what retailers can do to create more inclusive, accessible experiences for all customers.

AI-driven personalization has revolutionized the online shopping experience, helping retailers tailor content and product recommendations to individual customers. But as powerful as AI is, it’s not without flaws, especially when it comes to inclusion.

Without intentional design, AI-powered personalization can exclude people with disabilities by filtering out critical accessibility features or failing to account for diverse user needs. This exclusion doesn’t just harm customers; it also undermines the very goals of personalization: creating relevant, enjoyable, and accessible shopping experiences for everyone.

Below, we’ll unpack how biases in AI personalization happen, why accessibility must be part of your personalization strategy, and what retailers can do to build more inclusive and equitable digital experiences.

When AI Gets It Wrong

Bias in AI doesn’t appear out of thin air; it’s a byproduct of the data used to train algorithms and the decisions made during system design. Even well-meaning teams can introduce bias, especially when accessibility isn’t baked into the design from the start. In retail, these biases can show up in several key ways:

1. Biased Training Data

AI models are only as inclusive as the data they’re trained on. If training datasets don’t reflect the full diversity of your customer base, including people with disabilities, older adults, or those using assistive technologies, your system may miss the mark. This lack of representation can result in:

  • Product recommendations that ignore accessibility-related products or features
  • Interfaces that don’t respond well to assistive technology
  • Hidden accessibility settings that aren’t surfaced in personalized views

For example, if an AI system hasn’t seen user behavior typical of a shopper using a screen reader, it might conclude that alt text or semantic HTML is unnecessary. This creates a poor, and often unusable, experience for blind or visually impaired customers.
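
One practical way to catch this kind of gap early is to check how well assistive-technology sessions are represented in the data that trains your model. The snippet below is a minimal sketch in Python, assuming your interaction logs carry some flag for assistive-technology use; the session records and the assistive_tech field are hypothetical stand-ins for whatever your analytics actually capture.

  from collections import Counter

  def representation_report(sessions, group_key="assistive_tech"):
      """Summarize how often each assistive-technology group appears in the
      interaction data that feeds a personalization model."""
      counts = Counter(s.get(group_key, "none") for s in sessions)
      total = sum(counts.values())
      return {group: count / total for group, count in counts.items()}

  # Hypothetical sample: screen-reader sessions make up only 0.5% of the data,
  # so the model has very little signal about how those shoppers actually browse.
  sessions = [{"assistive_tech": "screen_reader"}] * 5 + [{"assistive_tech": "none"}] * 995
  print(representation_report(sessions))  # {'screen_reader': 0.005, 'none': 0.995}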

Beyond usability, these oversights can open retailers up to legal and reputational risks. In 2023, a blind customer, Ali Abdulhadi, sued Walmart, alleging that its website was not accessible and therefore in violation of the Americans with Disabilities Act (ADA). While this case wasn’t about AI specifically, it demonstrates how digital inaccessibility, regardless of the cause, can have serious consequences.

2. Implicit Human Bias

AI doesn’t just inherit bias from data; it can also reflect the assumptions of the people who build it. If developers or data scientists unintentionally deprioritize accessibility features or fail to test for a broad range of user needs, bias becomes baked into the system.

Consider this: If your personalization engine assumes that audio descriptions or closed captions aren’t important, visually impaired or hearing-impaired users may never encounter accessible content. The algorithm might deem these features irrelevant simply because they don’t appear to drive clicks in the majority population.

This kind of implicit bias leads to personalization that favors the “average user,” a concept that often erases those with different needs and abilities.

3. Personalization That Creates Barriers

In today’s retail environment, personalization filters content based on behavioral data like browsing history, past purchases, or device type. The goal is to show each user only the most relevant content.

But personalization can inadvertently create a “digital divide” by hiding or deprioritizing accessibility features that some users depend on. For example:

  • A hearing-impaired shopper who enjoys video tutorials may not be shown captioned videos if the algorithm assumes their preferences don’t include them.
  • A neurodivergent customer might benefit from simplified navigation or reduced visual clutter, but if their prior behavior doesn’t explicitly signal this, the system might not offer it.
  • Alt text, keyboard navigation, and screen-reader compatibility may be filtered out of a personalized experience because the AI deems them irrelevant to the majority.

This can result in entire user groups being excluded from the personalized experience altogether, or worse, facing digital environments that are difficult or even impossible to navigate.
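
To make this concrete, here is a deliberately simplified sketch, with made-up field names and a made-up relevance threshold, of how a naive relevance cutoff can silently drop accessibility content, and how a small exemption keeps it available:

  RELEVANCE_THRESHOLD = 0.4  # hypothetical cutoff applied by the ranking step

  def naive_filter(items):
      """Keep only items the model scored as 'relevant'; low-scoring captioned
      videos or simplified layouts silently disappear from the page."""
      return [i for i in items if i["score"] >= RELEVANCE_THRESHOLD]

  def inclusive_filter(items):
      """Same cutoff, but anything tagged as an accessibility feature is always
      kept, no matter how the majority-trained model scored it."""
      return [i for i in items
              if i["score"] >= RELEVANCE_THRESHOLD or i.get("accessibility_feature")]

  catalog = [
      {"name": "Video tutorial (captioned)", "score": 0.2, "accessibility_feature": True},
      {"name": "Video tutorial", "score": 0.8},
      {"name": "Simplified navigation layout", "score": 0.1, "accessibility_feature": True},
  ]

  print([i["name"] for i in naive_filter(catalog)])      # ['Video tutorial']
  print([i["name"] for i in inclusive_filter(catalog)])  # all three remain available

The exact threshold isn’t the point; the point is that accessibility content should never have to compete with majority-driven relevance scores for the right to appear at all.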

How Retailers Can Build More Inclusive AI Personalization

Personalization and accessibility are not competing priorities; they’re complementary. In fact, the most successful retailers in the years ahead will be those that design AI systems that serve all their customers, not just the statistical majority.

Here are practical steps you can take to make sure your personalization efforts are inclusive:

  • Diversify your training data: Include examples from users of different abilities, assistive tech, and access needs. This helps your AI learn patterns that reflect real-world diversity.
  • Make accessibility features visible by default: Don’t hide key features behind personalization filters. Caption toggles, alt text, and screen reader-compatible layouts should always be accessible.
  • Give users control: Let customers set accessibility preferences directly in their profile or session settings, and make sure those preferences persist across visits and channels (a minimal sketch of this idea follows the list).
  • Design inclusive algorithms: Train your AI models to recognize and prioritize accessibility needs, even when they aren’t the most common user behaviors.
  • Audit regularly: Conduct ongoing accessibility reviews of your personalization engine, not just your general UI. Include testers with disabilities to validate real-world performance.
  • Be transparent: Tell your customers how their data is being used and give them a say in how their experience is shaped.
  • Introduce explainability: Personalization can often feel opaque—especially when AI systems hide or prioritize content in ways that affect accessibility. Simple explanations like “Recommended because you viewed X” or “Your accessibility preferences are shaping this view” can build trust, help teams debug issues, and surface patterns of exclusion early.
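
As one way to picture how a few of these steps fit together, the sketch below uses hypothetical field and preference names to show explicit, persistent accessibility preferences being honored ahead of model scores, with a plain-language explanation attached to each recommendation:

  from dataclasses import dataclass

  @dataclass
  class AccessibilityPreferences:
      """Preferences a customer sets once in their profile and that persist
      across sessions and channels, rather than being inferred from clicks."""
      captions_required: bool = False
      simplified_layout: bool = False

  def personalize(items, prefs: AccessibilityPreferences):
      """Rank by model score, but honor explicit preferences first and attach
      a short explanation to every recommendation."""
      results = []
      for item in sorted(items, key=lambda i: i["score"], reverse=True):
          if prefs.captions_required and item.get("media") == "video" and not item.get("captioned"):
              continue  # never surface uncaptioned video to this customer
          reason = "Recommended because of items you viewed"
          if item.get("accessibility_feature"):
              reason = "Shown because of your accessibility preferences"
          results.append({**item, "why": reason})
      return results

  prefs = AccessibilityPreferences(captions_required=True)
  items = [
      {"name": "How-to video", "media": "video", "captioned": False, "score": 0.9},
      {"name": "How-to video (captioned)", "media": "video", "captioned": True,
       "score": 0.5, "accessibility_feature": True},
  ]
  for r in personalize(items, prefs):
      print(r["name"], "-", r["why"])  # only the captioned video, with its reason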

Inclusive Personalization = Better Personalization

AI-driven personalization is only truly effective when it serves everyone. By addressing algorithmic bias, designing with accessibility in mind, and building inclusion into your strategy from the start, you’re not only doing the right thing but also building a smarter, more resilient retail experience.

Concord can help you uncover the sources of bias in your data and personalization models, and we offer practical design interventions that enhance customer experience while driving revenue growth and operational efficiency.

Our experts work with you to build inclusive personalization engines powered by ethical data practices and accessibility-first design to make sure your experience works for all customers, including people with disabilities. We also recommend regular audits to promote transparency and continuous improvement.

Reach out to learn how we can help you deliver more inclusive, equitable, and impactful personalization.

