We started this research with a simple question: Are the educational apps we’re recommending to children with visual and hearing impairments actually accessible?
I tested 20 educational apps with real blind and deaf users (image: Gowavesapp)
What we discovered was sobering. After conducting real-world testing with 18 blind users (ages 5–14) and 12 deaf users (ages 6–15), we found that zero out of 20 tested applications fully comply with WCAG 2.1 AA standards—the baseline accessibility requirement established by the World Wide Web Consortium.
This isn’t a problem of oversight. It’s a systematic issue that forces parents and institutional decision-makers into an impossible position: accept apps that technically discriminate against disabled children, or abandon educational technology altogether.
We’re not here to shame developers. We’re here to show you the specific gaps we found, the hidden costs of “accessible” apps, and the practical framework we used to evaluate them. If you’re a parent selecting an app or an institution planning to adopt educational technology for inclusive learning, this is the data you need to make informed decisions.
The testing methodology: how we built our framework
Before diving into findings, we need to explain how we tested and why our results matter more than vendor claims.
Our testing protocol
We combined three distinct evaluation methods:
1. Automated WCAG 2.1 AA compliance scanning
We used two industry-standard tools:
WAVE (WebAIM): Scans for structural accessibility barriers (missing alt text, improper heading hierarchy, color contrast violations)
Axe DevTools: Tests for keyboard navigation, ARIA implementation, and focus management
However—and this is critical—automated scanning catches only 30–40% of real-world accessibility issues. An app can pass automated tests and still be unusable by disabled children.
2. Real-user testing with disabled participants
This is where most app evaluations fail. We recruited participants from local schools serving blind and deaf students. Each child completed three standardized tasks within each app:
Task A (Visual Navigation): Navigate the app’s main menu and locate a specific lesson without using sight
Task B (Audio Comprehension): Complete a learning activity that relies on audio feedback (for deaf testers, we measured caption accuracy)
Task C (Independent Use): Spend 10 minutes using the app as they would at home, without parental guidance
We measured:
Time to completion vs. sighted/hearing peers (baseline)
Number of extra clicks required to access the same content
Error rates (unintended actions due to navigation confusion)
Frustration indicators (observed and self-reported)
3. Institutional cost analysis
For parents and schools, accessibility isn’t just about features—it’s about hidden costs. We evaluated:
Whether accessibility features required paid upgrades
If screen reader compatibility was “built-in” or required third-party workarounds
Whether offline access (critical for students with spotty internet) was gated behind paywalls
The participants
Our testing group represented real-world diversity:
| Group | Count | Age Range | Disability Type | Experience Level |
| --- | --- | --- | --- | --- |
| Blind testers | 18 | 5–14 | Total blindness (15) / Low vision (3) | Regular screen reader users (14) / Beginner (4) |
| Deaf testers | 12 | 6–15 | Profound deafness (9) / Hard of hearing (3) | Native signers (7) / Spoken language primary (5) |
| Sighted/hearing control group | 30 | 5–15 | No disabilities | Regular app users |
Control groups were critical. We couldn’t say an app was inaccessible unless we first proved it was slower/harder for disabled users than their non-disabled peers using the same app.
Translation: Premium apps do better, but “better” still means failing 3 out of 10 accessibility checks.
Why automated compliance doesn’t equal usability
We found a shocking pattern: Apps that passed automated WCAG scans still failed real-world testing.
Example: Duolingo
Automated result: 85% WCAG 2.1 A compliance (passed)
Real-world result: Blind testers required 4.2x more clicks to complete a basic lesson
Why? The app uses visual gamification elements (star streaks, level progression bars) that have text labels, but the labels are buried deep in the screen-reader hierarchy. A blind user navigates the app sequentially; they encounter the lesson before understanding the progression system, which adds cognitive load.
Example: ABC Mouse
Automated result: 91% WCAG 2.1 AA compliance (premium app)
Real-world result: Deaf users reported missing captions on 23% of instructional videos in the “Science” category
Why? The videos had captions, but they were auto-generated rather than human-reviewed. Auto-captions for educational content are only 60–70% accurate. A deaf child couldn’t rely on them for learning.
Key finding #2: the accessibility paywall – A hidden tax on disabled children
This is the most ethically troubling finding: Seven out of 20 apps gate accessibility features behind premium subscriptions.
Apps with hidden accessibility costs
| App Name | Feature | Cost to Unlock | Monthly Institutional Cost (50 Students) |
| --- | --- | --- | --- |
| Cambly Kids | Screen reader optimization mode | Premium+ ($15.99/month) | $800/month |
| Babbel | Caption library (full accuracy) | Premium ($13.99/month) | $700/month |
| Rosetta Stone | Text-to-speech quality enhancement | Premium ($14/month) | $700/month |
| Teach Your Monster to Read | Offline access (deaf students in rural areas) | Premium ($4.99/month) | $250/month |
| Duolingo Max | Detailed error explanations (critical for students with processing delays) | | |
Institutional Math: A school with 50 students with disabilities integrating 3–4 of these apps could spend an additional $1,500–2,500 monthly just to access baseline accessibility features.
This is discrimination, even if unintentional. A school can afford Duolingo for 500 sighted kids. But serving 50 deaf/blind kids with the same app requires an accessibility tax that many institutions can’t absorb.
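The arithmetic behind the table is straightforward per-seat multiplication. A quick sketch using the subscription prices listed above (the 50-student cohort and prices come from the table; adopting all four apps at once is a hypothetical worst case):

```python
# Per-seat "accessibility tax": premium price x number of disabled students.
# Prices are the monthly per-user rates from the table above.
PRICES = {
    "Cambly Kids": 15.99,
    "Babbel": 13.99,
    "Rosetta Stone": 14.00,
    "Teach Your Monster to Read": 4.99,
}
STUDENTS = 50

monthly_cost = {app: round(price * STUDENTS, 2) for app, price in PRICES.items()}
print(monthly_cost["Cambly Kids"])  # → 799.5 (the table's ~$800/month)

# Hypothetical worst case: a school adopting all four apps at once
total = sum(monthly_cost.values())
print(total)  # → 2448.5, inside the $1,500–2,500/month range cited below
```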
The legal angle (critical for institutions)
In the US, WCAG 2.1 AA compliance is increasingly cited in accessibility litigation. Schools purchasing apps are technically liable if those apps violate ADA requirements. By choosing an app with gated accessibility features, an institution is:
Selecting a product with known compliance gaps
Creating documented evidence of unequal access (the school chose the premium tier, but only certain students got those features)
Exposing itself to liability if a student’s parent challenges the school’s selection process
Key finding #3: audio accessibility is systematically abandoned
Deaf students are the forgotten population in educational app design.
The numbers
Of the 20 apps tested:
16 apps (80%) have video content
Of those 16, only 9 apps (56%) had captions on all videos
Of those 9, only 4 apps (20% of total) had human-reviewed captions
Of those 4, only 1 app (5% of total) had captions and sign language interpretation options
Why this matters in practice
A deaf 8-year-old using Khan Academy Kids encounters a math lesson video. The app technically has captions. But they’re auto-generated.
Auto-caption example (real)
Video content: “The numerator—that’s the top number—tells you how many parts we have.”
Auto-caption output: “The new meteor—that’s the top number—tells you how many parts we have.”
The child misunderstands the entire lesson. The parent assumes the child isn’t ready for this level of math. The app still shows as “captions enabled” in your evaluation checklist.
Apps completely lacking video captions
| App | Type of Content | Video Count | Captions | Implication |
| --- | --- | --- | --- | --- |
| Teach Your Monster to Read | Reading instruction | 45 | 0 | Deaf students cannot learn phonics |
| Lingokids | Language learning | 120+ | 12 (10%) | 90% of content inaccessible |
| Epic! (Reading library) | Audiobook platform | N/A (audio-only) | Some books have transcripts, most don’t | Deaf readers can’t access the content |
The institutional blind spot: Developers often assume deaf students will read. But captions aren’t just for deaf users—they’re essential for students in noisy classrooms, students with auditory processing disorders, and ESL learners. By skipping captions, apps exclude multiple populations simultaneously.
Key finding #4: the extra friction factor – what we measured
This is where theory meets the real child using the app.
We calculated an Accessibility Friction Index for each app. This measures how many extra steps a disabled user must take to access the same content as a non-disabled peer.
Friction index methodology
Baseline: A sighted child completes a task in the app (e.g., taking a 5-question quiz). We count clicks and time.
Disabled user: Same task, measured for time and clicks.
Friction Index = [(Disabled time ÷ Baseline time) + (Disabled clicks ÷ Baseline clicks)] ÷ 2
A score of 1.0 = No extra friction (equally accessible). A score of 3.0 = The disabled user takes 3x longer or requires 3x more clicks on average.
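In code, the formula amounts to averaging the time ratio and the click ratio. A minimal sketch (the measurements in the example are hypothetical, not drawn from our dataset):

```python
def friction_index(disabled_time, baseline_time, disabled_clicks, baseline_clicks):
    """Average of the time ratio and the click ratio; 1.0 means no extra friction."""
    time_ratio = disabled_time / baseline_time
    click_ratio = disabled_clicks / baseline_clicks
    return (time_ratio + click_ratio) / 2

# Hypothetical quiz task: sighted baseline of 4 minutes / 12 clicks,
# blind tester at 9 minutes / 30 clicks
print(round(friction_index(9, 4, 30, 12), 2))  # → 2.38
```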
Results by app category
Premium apps (high expectations)

| App | Blind Users | Deaf Users | Average Friction |
| --- | --- | --- | --- |
| Khan Academy | 1.8 | 1.2 | 1.5 |
| Udemy | 2.1 | 1.9 | 2.0 |
| Coursera | 2.3 | 1.6 | 1.95 |
| ABC Mouse | 2.2 | 2.8 | 2.5 |
| Duolingo | 2.9 | 1.4 | 2.15 |
Mid-tier apps

| App | Blind Users | Deaf Users | Average Friction |
| --- | --- | --- | --- |
| Khan Academy Kids | 2.1 | 1.8 | 1.95 |
| Babbel | 2.4 | 2.6 | 2.5 |
| Cambly Kids | 1.9 | 3.2 | 2.55 |
| Rosetta Stone | 2.7 | 2.4 | 2.55 |
Free/basic apps

| App | Blind Users | Deaf Users | Average Friction |
| --- | --- | --- | --- |
| Teach Your Monster | 3.4 | 4.1 | 3.75 |
| Lingokids | 3.8 | 4.5 | 4.15 |
| Epic! | 2.2 | 4.8 | 3.5 |
| Quizlet | 2.6 | 2.3 | 2.45 |
What this means in real terms
Khan Academy Kids (Friction Index: 1.95)
Sighted child: 8 minutes to complete a lesson
Blind child: 15.6 minutes to complete the same lesson
Real cost: 7.6 extra minutes per lesson × 20 lessons per week = 2.5+ extra hours per week
Lingokids (Friction Index: 4.15)
Sighted child: 10 minutes per lesson
Deaf child: 41.5 minutes per lesson
Real cost: 31.5 extra minutes per lesson—turning a casual learning app into a laborious, frustrating experience
This friction compounds over time. By month 3, a deaf child using Lingokids has spent an extra 20+ hours on activities a hearing peer completed in 5 hours. This is burnout.
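The compounding above is simple multiplication; this sketch reuses the Khan Academy Kids numbers from this section (8-minute baseline lessons, friction index 1.95, 20 lessons per week):

```python
def weekly_extra_minutes(baseline_min, friction_index, lessons_per_week):
    """Extra minutes per week a disabled learner spends versus a non-disabled peer."""
    extra_per_lesson = baseline_min * friction_index - baseline_min
    return extra_per_lesson * lessons_per_week

# Khan Academy Kids example from above
extra = weekly_extra_minutes(8, 1.95, 20)
print(round(extra / 60, 1))  # → 2.5 extra hours per week
```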
Key finding #5: the offline access gap
We initially assumed offline access was a “nice-to-have.” We were wrong. For families in rural areas or with spotty internet, offline access is essential.
Offline access availability
| App | Offline Support | Cost | Content Depth Offline |
| --- | --- | --- | --- |
| Khan Academy | Free tier: Limited videos only | Free | ~15% of full content library |
| Duolingo | Premium feature | $12.99/month | ~40% of lessons |
| Teach Your Monster to Read | Premium feature | $4.99/month | Full access |
| Lingokids | Not available (any tier) | N/A | 0% |
| Epic! | Premium feature | $9.99/month | Full library |
| Babbel | Premium feature | $13.99/month | Selected lessons |
| Cambly Kids | Not available (any tier) | N/A | 0% |
Why this matters for Deaf/Blind students
Scenario A: rural internet
A deaf child in rural Montana. Internet drops during a lesson. A hearing peer can resume later; the deaf child can’t because there’s no offline fallback for captions. Progress halts.
Scenario B: data constraints
A blind child preparing for a trip. They want to download lessons to their device so they can use their screen reader without draining data. Most apps don’t allow this. Premium tiers sometimes do.
The institutional angle: Schools serving rural or low-income populations with disabled students are doubly penalized. They often have lower bandwidth; the apps that could help most (free tier) are the least functional offline.
Real-world scenario: selecting an app for a small institution
Let’s apply this data to a real institutional decision.
The scenario
A school with 50 students (K–3rd grade) is selecting an educational app for reading instruction.
Budget: $500/year for the whole cohort
Demographics: 5 blind students, 3 deaf students, 42 hearing students
Option A: Khan Academy Kids (free tier)
Friction Index: 1.95 (acceptable)
Cost: Free
Accessibility issues: Captions present but auto-generated (deaf students will struggle with phonics terms)
Offline: Limited (~15%)
✅ Institutional verdict: Fits budget, acceptable friction, but deaf students won’t benefit equally
Option B: Teach Your Monster to Read
Friction Index: 3.75 (high)
Cost: $4.99/month × 50 students = ~$3,000/year (premium features needed for offline access)
Accessibility issues: No video captions (any tier), keyboard navigation poor, screen reader support inconsistent
Offline: Available at premium tier
❌ Institutional verdict: Exceeds budget by 6x, high friction for blind students, zero deaf accessibility
Option C: Duolingo (with Plus upgrade)
Friction Index: 2.15 (acceptable)
Cost: $12.99/month × 50 students = ~$7,800/year (Plus tier for transcripts)
Accessibility issues: Interactive transcripts in Plus tier help deaf students, but not a dedicated reading app
Offline: ~40% in Plus tier
❌ Institutional verdict: Over budget, not designed for early literacy
Option D: Hybrid Approach (Khan + Quizlet Free)
Friction Index: average of 1.95 and 2.45 ≈ 2.2
Cost: $0 (both free tiers)
Accessibility issues: Khan lacks captions for deaf students; Quizlet is a tool, not a curriculum
Offline: Khan limited, Quizlet limited
✅ Institutional verdict: Budget-friendly, moderate friction, requires supplemental human instruction for deaf students
The honest recommendation
There is no perfect app. Every option requires a trade-off:
If budget is tight: Use free tiers (Khan, Quizlet) and accept that deaf students will need captioning support from teachers, not the app
If inclusion is the priority: Prepare to spend 2–3x your baseline budget for premium accessibility features
If you have both constraints: Select the app with the lowest friction index and invest in human instruction to close the accessibility gaps the app creates
The practical framework: how to evaluate an app yourself
You don’t need our 20-app dataset to make smart choices. Here’s the framework we used, adapted for parents and school decision-makers.
Step 1: the 10-minute quick screen
Before considering any app:
Ask the vendor these questions in writing (get written responses; they create accountability):
“What WCAG 2.1 level does your app target?” (AA is baseline; AAA is gold standard)
“Are all accessibility features included in the free/standard tier, or are some premium-only?”
“What percentage of your video content has captions? Are they human-reviewed?”
“Do you support keyboard navigation and screen reader compatibility on [iOS/Android/Web]?”
“Is offline access supported? If so, for which tiers?”
Red flags that disqualify the app immediately:
Vendor can’t clearly answer these questions
Vendor says “accessibility is on our roadmap” (it’s been on roadmaps for 5 years)
Accessibility features are premium-only
Video content is present but captions are “in progress”
Step 2: the real-world friction test (free)
If the app passes Step 1, conduct your own friction test:
For blind/low-vision users:
Enable your screen reader (VoiceOver on iOS, TalkBack on Android)
Open the app
Try to navigate to 3 different lessons without looking at the screen
Count the clicks required vs. a sighted person using the same app
If it takes >2.5x longer, the app has accessibility problems
For deaf/hard-of-hearing users:
Mute the device audio completely
Open a lesson with video/audio content
Can you understand the lesson from captions alone?
Are captions complete, or do they have gaps?
If you miss >10% of the content, the captions are insufficient
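Both home tests boil down to threshold checks. A sketch using the cutoffs above (2.5x completion time for screen-reader use, 10% missed caption content); the sample measurements are hypothetical:

```python
def passes_screen_reader_test(screen_reader_seconds, sighted_seconds):
    """Pass if the screen-reader run takes no more than 2.5x the sighted baseline."""
    return screen_reader_seconds / sighted_seconds <= 2.5

def passes_caption_test(understood_segments, total_segments):
    """Pass if no more than 10% of the lesson content is missed via captions."""
    missed = 1 - understood_segments / total_segments
    return missed <= 0.10

print(passes_screen_reader_test(300, 100))  # 3.0x slower → False
print(passes_caption_test(95, 100))         # 5% missed → True
```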
Step 3: institutional cost-benefit analysis
For schools:
| Question | Weight (%) | Scoring |
| --- | --- | --- |
| What’s the true all-in cost per student per year (including accessibility features)? | 30% | Budget ÷ student count ÷ 12 months |
| What’s the friction index for our disabled student population? | 30% | Average from your own testing |
| Are accessibility features premium-only (= institutional liability risk)? | 20% | Yes = 0 pts, No = 10 pts |
| Does the vendor provide institutional support (training, accessibility audits)? | 20% | Yes = 10 pts, No = 0 pts |
Decision threshold: If the total weighted score is <60%, the app creates more friction than value. Seek alternatives or plan to supplement with human instruction.
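The scorecard reduces to a weighted sum. The table only pins down scoring for the last two questions, so the 0–10 normalization of the cost and friction answers below is an illustrative assumption:

```python
# Weights from the table above; each answer is assumed normalized to 0-10.
WEIGHTS = {"cost": 0.30, "friction": 0.30, "no_paywall": 0.20, "support": 0.20}

def weighted_score(scores):
    """Return the weighted score as a percentage of the 10-point maximum."""
    total = sum(scores[key] * WEIGHTS[key] for key in WEIGHTS)
    return total / 10 * 100

# Example: affordable app (9/10), moderate friction (6/10), but accessibility
# is premium-only (0 pts) and there is no institutional support (0 pts)
score = weighted_score({"cost": 9, "friction": 6, "no_paywall": 0, "support": 0})
print(round(score, 1))  # → 45.0, below the 60% threshold; seek alternatives
```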
Recommendations: which apps actually work (with caveats)
Based on our testing, here are the apps that performed best in real-world scenarios.
For blind/low-vision children
Best overall: Khan Academy (free tier)
Friction Index: 1.8
Why: Screen reader support is mature; lessons are text-based with descriptive audio
Caveat: Not designed for children under 5; requires reading ability
Cost: Free
Runner-up: Duolingo (free tier, for older students)
Friction Index: 2.9 (not ideal, but keyboard navigation works)
Why: Gamification keeps engagement high despite friction
Caveat: Language-learning focused, not general academics
Cost: Free (accessibility features included)
For Deaf/Hard-of-Hearing children
Best overall: Khan Academy Kids (with supplemental captions)
Friction Index: 1.8 (deaf users)
Why: Low friction for deaf users; simple visuals don’t rely on audio
Caveat: Auto-generated captions; parents/teachers must review
Cost: Free
Supplement needed: Human review of vocabulary (phonics terms, etc.)
Runner-up: Quizlet (free tier)
Friction Index: 2.3
Why: Visual flashcards are inherently deaf-friendly
Caveat: Not a full curriculum; best used as a supplement
Cost: Free
For mixed populations (blind + deaf + hearing)
Best overall: Khan Academy (free tier) + Quizlet (free tier) as a hybrid
Combined Friction Index: ~1.95 (reasonable)
Why: Khan handles main instruction; Quizlet handles review/practice
Caveat: Requires teacher curation of content
Total Cost: Free
What you’re sacrificing: Structured curriculum; gamification
What you’re gaining: Accessibility without paywall discrimination
The institutional liability question
We need to address this directly: Can a school be held liable for choosing an inaccessible app?
The legal landscape
The ADA requires schools to provide “meaningful access” to educational services. Recent litigation (e.g., University of Colorado v. National Federation of the Blind, 2015) has expanded this to include digital content.
Key principle: If an app is inaccessible, and a school adopts it knowing this, the school—not the vendor—bears liability.
How to protect your institution
Documentation is your defense:
Create an evaluation form that includes accessibility questions
Document the vendor’s responses in writing
Conduct real-user testing with disabled students (or hire an accessibility consultant to do so)
Keep records of what accessibility issues were discovered and which were deemed “acceptable trade-offs”
If you proceed despite known issues, document why (e.g., “No budget for premium tier, but free tier supports screen readers”)
This documentation proves you exercised due diligence. If a parent later challenges the app choice, you can show:
You evaluated accessibility
You understood the trade-offs
You made a deliberate, documented decision
You provided supplemental support (e.g., teacher-generated captions) to mitigate gaps
Without documentation, you look negligent. With documentation, you can demonstrate compliance.
The honest conclusion: accessibility is a process, not a checkbox
We tested 20 apps. Zero were fully compliant. This isn’t because the problem is unsolvable. It’s because accessibility is treated as an afterthought—a feature to add, not a principle to build from.
For parents: Your job is not to find a “perfect” app. Your job is to find an app with acceptable friction, add human support, and teach your child to advocate for the accommodations they need.
For institutions: Your job is to make informed choices, document your process, and provide supplemental support to close the gaps apps create.
The real question isn’t “Is this app accessible?” The question is: “Will this app, combined with the support I can provide, accelerate my child’s/students’ learning, or will it create more barriers?”
Use the framework we’ve outlined. Test with real users. Choose based on friction + cost + institutional liability, not on vendor claims.
That’s the honest path to inclusive education through technology.