Digital accessibility is one layer. Digital disability justice looks at the bigger picture:
- Who designs and controls technology
- How data, algorithms, and platforms impact disabled people
- How surveillance, policing, and profit models target disabled bodies and minds
- How disabled communities use tech to organize, care for each other, and resist
This page connects disability justice principles to the digital world.
Compliance frameworks (like WCAG and legal standards) focus on:
- Minimum technical requirements
- Avoiding lawsuits or complaints
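To make "minimum technical requirements" concrete: automated compliance tooling typically scans markup for rule violations, such as images without alt text (one WCAG success criterion). The sketch below is a hypothetical illustration using only the Python standard library, not a real audit tool; class and variable names are invented for this example.

```python
# Minimal sketch of an automated compliance-style check:
# find <img> tags that lack an alt attribute.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of every <img> tag missing an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attr_dict = dict(attrs)
        if tag == "img" and "alt" not in attr_dict:
            self.missing_alt.append(attr_dict.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<img src="chart.png"><img src="logo.png" alt="Site logo">')
print(checker.missing_alt)  # only the first image fails the check
```

A page can pass every check of this kind and still be unusable in practice, which is exactly the gap the questions below point at.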
Disability justice asks deeper questions:
- Who is still excluded even when a website is “compliant”?
- Whose bodies and brains are assumed to be “normal” in design decisions?
- How do race, class, gender, migration, queerness, and disability intersect in digital spaces?
- Who profits from “accessibility solutions,” and who actually gets control?
Examples of issues that disproportionately harm disabled people:
- Algorithmic discrimination – Automated systems that flag disabled people as "fraudulent," "high risk," or "low productivity."
- Surveillance tech – Monitoring in schools, workplaces, welfare systems, and institutions that punishes disabled ways of moving, speaking, or behaving.
- Data extraction – Health and disability data used for profit or predictive policing without consent.
- Inaccessible public services – Government systems that only work in certain browsers, require high digital literacy, or are hostile to assistive tech.
- Platform control – Social media companies shaping who gets visibility and who is silenced, with disabled voices often deprioritized or flagged.
Disabled people also use technology to:
- Build support networks and mutual aid groups
- Share survival knowledge outside official channels
- Document abuse and discrimination
- Create art, culture, and new narratives of disability
- Organize campaigns and actions for rights and justice
Digital disability justice recognizes and centers this organizing.
Some guiding ideas that contributors can expand:
- Nothing about us without us – Disabled people (especially multiply marginalized disabled people) must be involved in designing, governing, and regulating tech.
- Access is not neutral – Access decisions reflect power and values, not just “best practices.”
- No one left behind for convenience – Design against the tendency to treat some disabled people as “edge cases” who can be sacrificed.
- Care over extraction – Use technology to support care, rest, and autonomy rather than only productivity, surveillance, or profit.
- Accountability – Platforms, governments, and companies should be accountable for digital harms they create or profit from.
Digital disability justice sits at the crossroads of:
- Tech & accessibility (/tech)
- Rights & law (/rights)
- Benefits and welfare systems (/benefits)
- Housing, healthcare, and education sections
- Intersectionality pages (/intersectionality)
Contributors can:
- Add case studies (e.g., biased algorithms in benefits systems, surveillance in institutions)
- Link to writing by disabled scholars, activists, and communities on digital justice
- Map campaigns and organizations working at this intersection
The goal is not just to fix individual interfaces, but to shift power toward disabled people in digital spaces.