
Paralympian Calls For Inclusive AI


A decorated Paralympic swimmer, Jess Smith, is urging technology leaders to center inclusion in artificial intelligence development. In recent remarks, Smith called for people with disabilities to be visible and heard as companies design and deploy AI tools. Her message comes as AI systems reach into education, health care, media, and sports, raising questions about who benefits and who gets left out.

Smith’s comments highlight a growing push for representation across industries that are using AI. The call is clear: those impacted by the technology should have a role in shaping it. For athletes with disabilities, that can mean better training tools, fairer officiating, and more accessible fan experiences.

Why Representation Matters In AI

“Representation means being seen as part of the AI world that’s being built.” — Jess Smith

Smith’s point speaks to a simple idea with high stakes. AI systems learn from data and from the people who design them. If people with disabilities are not included, products can miss basic needs. That gap can turn into biased outputs or features that exclude users.

Representation also shapes priorities. Diverse teams are more likely to flag accessibility issues early, choose inclusive datasets, and test features with different users. That work reduces the risk of tools that fail in real settings, from voice systems that mishear to interfaces that screen out talent.

Sports And Technology Converge

AI is already influencing training and performance analysis. Coaches use computer vision to study form. Broadcasters rely on automated captions and highlight reels. For Paralympic sports, these tools can open access and boost fairness, but only if they work for the people on the field and in the stands.


Smith’s appeal suggests practical steps. Inclusive datasets could reflect a wide range of bodies, equipment, and movement patterns. Officiating aids could be tested with adaptive equipment. Broadcast technology could be held to high accessibility standards, including accurate captions and audio description.

Risks Of Exclusion And Bias

When AI does not account for disability, the harm is immediate. Training systems may misjudge technique if models were built only on non-disabled athletes. Hiring filters may screen out candidates who use assistive technology. Customer support chatbots may fail to handle essential accessibility requests.

Advocates warn that these errors are not edge cases. They are design choices. Without representation, teams may not spot failure modes until after launch. That can damage trust and invite legal and reputational risk.

What Inclusion Could Look Like

Experts across accessibility, ethics, and product design have outlined practical measures that align with Smith’s message. These steps are not only ethical; they also improve product quality and market reach.

  • Hire people with disabilities across research, design, and testing roles.
  • Co-design features with athletes, coaches, and fans who use adaptive equipment.
  • Audit datasets for representation gaps and label accessibility attributes.
  • Test outputs with assistive technologies like screen readers and alternative input devices.
  • Publish accessibility performance metrics along with accuracy or speed claims.
  • Create feedback loops so users can report issues and see fixes.
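The dataset-audit step above can be made concrete. The following is a minimal Python sketch of one way a team might check a labeled dataset for representation gaps; the field name `athlete_class`, the group labels, and the 25% threshold are illustrative assumptions, not a standard.

```python
from collections import Counter

def audit_representation(records, attribute, expected_groups):
    """Count how often each expected group appears for a given
    attribute, so missing or underrepresented groups stand out."""
    counts = Counter(r.get(attribute, "unlabeled") for r in records)
    total = len(records)
    report = {}
    for group in expected_groups:
        share = counts.get(group, 0) / total if total else 0.0
        report[group] = {"count": counts.get(group, 0), "share": share}
    return report

# Hypothetical training records for a sports computer-vision model.
records = [
    {"athlete_class": "non-disabled"},
    {"athlete_class": "non-disabled"},
    {"athlete_class": "wheelchair"},
    {"athlete_class": "non-disabled"},
]

report = audit_representation(
    records, "athlete_class",
    ["non-disabled", "wheelchair", "prosthesis"],
)
for group, stats in report.items():
    flag = " <-- gap" if stats["share"] < 0.25 else ""
    print(f"{group}: {stats['count']} ({stats['share']:.0%}){flag}")
```

Even a check this simple surfaces the core issue: a group that never appears in the data (here, "prosthesis") cannot be served well by the resulting model.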

Balancing Speed With Accountability

Companies often face pressure to ship quickly. Smith’s comments press for a different bar: ship responsibly. That does not require slowing innovation. It requires planning for inclusion from the start. Early engagement with affected users can prevent expensive redesigns later.


Policy is also part of the picture. Accessibility standards, procurement rules, and disclosure requirements can raise expectations across the sector. Clear guidelines help teams build to a consistent benchmark rather than retrofit fixes after release.

Smith’s call to be “seen as part of the AI world” is a reminder that technology is a social choice. AI will shape training, media, and work for years to come. The question is whether it will reflect the people it serves. For sports and beyond, the next steps are practical: bring diverse users into the room, measure accessibility as rigorously as accuracy, and treat representation as core product work. Watch for teams that publish accessibility data, invest in co-design, and elevate disabled voices. Those choices will signal whether AI grows more inclusive or repeats old gaps at new scale.

Steve Gickling
CTO

A seasoned technology executive with a proven record of developing and executing innovative strategies to scale high-growth SaaS platforms and enterprise solutions. As a hands-on CTO and systems architect, he combines technical excellence with visionary leadership to drive organizational success.

About Our Editorial Process

At DevX, we’re dedicated to tech entrepreneurship. Our team closely follows industry shifts, new products, AI breakthroughs, technology trends, and funding announcements. Articles undergo thorough editing to ensure accuracy and clarity, reflecting DevX’s style and supporting entrepreneurs in the tech sphere.

See our full editorial policy.