Accessibility Testing for Mobile Apps: Practical Guide (2026)
Accessibility testing is the process of verifying that your app works for people with disabilities. It is a legal requirement in most regulated markets (ADA, EN 301 549, EAA) and a moral one everywhere. This guide covers what to test, which tools to use, and how to integrate accessibility checks into a mobile release cycle.
What accessibility testing actually covers
Four broad user groups:
- Vision — blindness, low vision, color blindness, age-related degeneration
- Motor — limited dexterity, tremor, one-handed use, switch access
- Hearing — deafness, partial loss
- Cognitive — memory impairment, learning differences, attention disorders
Each group needs different affordances. Vision needs screen readers and contrast. Motor needs large targets and keyboard / switch support. Hearing needs captions and visual indicators of audio events. Cognitive needs clarity, consistency, and error recovery.
What to test — the minimum viable checklist
Screen reader compatibility
- Every interactive element has an accessible label
- Labels are meaningful — "Submit", not "Button" (the screen reader already announces the role)
- Decorative images are marked decorative (not announced)
- Dynamic content changes are announced (`announceForAccessibility` on Android, `aria-live` on the web)
- Reading order matches visual order
- Custom views expose proper roles (`AccessibilityNodeInfo.setClassName`)
- Gestures have non-gesture alternatives (swipe → button)
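The label-quality rule above is partially automatable. A minimal sketch (plain Java; `LabelCheck` and its word list are hypothetical, not part of any framework) of flagging labels a screen reader would read unhelpfully:

```java
import java.util.Set;

public class LabelCheck {
    // Generic labels that tell a TalkBack/VoiceOver user nothing
    private static final Set<String> GENERIC =
            Set.of("button", "image", "icon", "view", "label");

    // True if the accessible label is missing or meaningless
    public static boolean isPoorLabel(String label) {
        if (label == null || label.isBlank()) return true;
        return GENERIC.contains(label.trim().toLowerCase());
    }

    public static void main(String[] args) {
        System.out.println(isPoorLabel("Button"));       // true  — generic
        System.out.println(isPoorLabel("Submit order")); // false — meaningful
    }
}
```

A check like this catches the worst offenders; judging whether "Submit order" is the *right* label still takes a human.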
Touch targets
- Minimum 48dp × 48dp per Android guidelines
- Larger targets (e.g. 64dp × 64dp) for users with motor impairments; WCAG AAA (2.5.5) specifies 44 × 44 CSS px
- Spacing between targets at least 8dp
- Floating Action Buttons clear of fixed elements
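The dp math behind these thresholds is easy to verify in code. A sketch (plain Java; the helper names are hypothetical) using Android's px-to-dp formula:

```java
public class TouchTargetCheck {
    // Android conversion: dp = px / (densityDpi / 160)
    public static double pxToDp(int px, int densityDpi) {
        return px / (densityDpi / 160.0);
    }

    // 48dp is the Android minimum; raise minDp (e.g. to 64) for motor-impaired users
    public static boolean meetsMinimum(int widthPx, int heightPx, int densityDpi, int minDp) {
        return pxToDp(widthPx, densityDpi) >= minDp
                && pxToDp(heightPx, densityDpi) >= minDp;
    }

    public static void main(String[] args) {
        // 96 × 96 px on a 420dpi screen is ~36.6dp — fails the 48dp bar
        System.out.println(meetsMinimum(96, 96, 420, 48));   // false
        // 132 × 132 px on the same screen is ~50.3dp — passes
        System.out.println(meetsMinimum(132, 132, 420, 48)); // true
    }
}
```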
Contrast
- Body text ≥ 4.5:1 against background
- Large text (18pt / 14pt bold) ≥ 3:1
- UI component borders ≥ 3:1 (WCAG 1.4.11)
- Focus indicators visible in all themes
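Contrast ratios can be computed exactly from the WCAG 2.x relative-luminance formulas. A self-contained Java sketch (helper names are mine, not from any library):

```java
public class ContrastCheck {
    // WCAG relative-luminance transfer function for one sRGB channel (0-255)
    public static double channel(int c) {
        double s = c / 255.0;
        return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
    }

    public static double luminance(int r, int g, int b) {
        return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
    }

    // Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter luminance on top; range 1..21
    public static double ratio(int[] fg, int[] bg) {
        double a = luminance(fg[0], fg[1], fg[2]);
        double b = luminance(bg[0], bg[1], bg[2]);
        double hi = Math.max(a, b), lo = Math.min(a, b);
        return (hi + 0.05) / (lo + 0.05);
    }

    public static void main(String[] args) {
        // Black on white is the maximum possible contrast, ~21:1
        System.out.println(ratio(new int[]{0, 0, 0}, new int[]{255, 255, 255}) > 20.9);
        // Mid-grey #767676 on white just clears the 4.5:1 body-text bar
        System.out.println(ratio(new int[]{0x76, 0x76, 0x76}, new int[]{255, 255, 255}) >= 4.5);
    }
}
```

This is the same math tools like Accessibility Scanner apply; sampling real rendered pixels (gradients, images behind text) is the hard part.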
Text
- Text respects system font size (up to 200%)
- No text in images (or alt text provided)
- Line height at least 1.5x font size (WCAG 1.4.12)
- Hyphenation does not cut critical info
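On Android, text sized in sp scales with the user's font-size setting (`fontScale`). A quick Java sketch of the arithmetic (hypothetical helper; px = sp × fontScale × density) showing why layouts must absorb a 2× growth:

```java
public class TextScaleCheck {
    // Android text sized in sp follows the user's font-size setting:
    // px = sp * fontScale * density
    public static double spToPx(double sp, double fontScale, double density) {
        return sp * fontScale * density;
    }

    public static void main(String[] args) {
        double density = 2.625; // ~420dpi device
        // At the 200% accessibility setting, 14sp body text doubles in pixels —
        // any container sized for the default will clip or overflow
        System.out.println(spToPx(14, 2.0, density)); // 73.5
        System.out.println(spToPx(14, 1.0, density)); // 36.75
    }
}
```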
Motion and animation
- Respects reduced-motion system setting
- Auto-scrolling content can be paused
- No strobe or flash > 3 times per second (photosensitivity)
Orientation and input
- Works in portrait AND landscape
- Works with external keyboard
- Works with switch control (iOS) / Switch Access (Android)
- Works with voice control
Forms
- Input fields have explicit labels (not just placeholders)
- Error messages linked to the failing field programmatically
- Validation errors announced on submit
- Required fields marked visibly AND programmatically
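A rough sketch (plain Java; the form model is hypothetical — on Android you would surface these via `TextInputLayout.setError`) of returning errors keyed to their failing fields so each one can be announced against its field:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FormValidation {
    // Key each error by its field so assistive tech can associate and announce it
    // against that field, rather than one generic "form invalid" message
    public static Map<String, String> validate(String email, String password) {
        Map<String, String> errors = new LinkedHashMap<>();
        if (email == null || !email.contains("@"))
            errors.put("email", "Enter a valid email address");
        if (password == null || password.length() < 8)
            errors.put("password", "Password must be at least 8 characters");
        return errors;
    }

    public static void main(String[] args) {
        Map<String, String> errors = validate("not-an-email", "short");
        System.out.println(errors.size());               // 2
        System.out.println(errors.containsKey("email")); // true
    }
}
```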
Testing tools
Android
- Accessibility Scanner (Google) — scans one screen at a time, flags contrast, touch targets, missing labels, tap target overlaps
- TalkBack — the actual screen reader users use. Enable in Settings, tour your app, listen
- Switch Access — enable and try to operate the app with one button
- Espresso + AccessibilityChecks — `AccessibilityChecks.enable().setRunChecksFromRootView(true)` in instrumented tests; fails the test on violations
- Lint a11y checks — missing `android:contentDescription`, missing `android:labelFor`
iOS
- VoiceOver — built in, enable via Triple-click Home or Side button
- Accessibility Inspector (Xcode) — runtime audit, includes contrast and label checks
- XCUITest accessibility audit — `XCUIApplication().performAccessibilityAudit()` (Xcode 15+), covers six audit categories
- Switch Control — motor accessibility testing
Cross-platform
- axe DevTools Mobile — commercial, deeper than free tools
- AccessibleOnly — accessibility overlay for Android
- Color Oracle — desktop color blindness simulator (useful for designers, not testers directly)
Manual testing approach
Screen reader tour
Enable TalkBack / VoiceOver. Explore the app using only swipes and double-taps. Do not look at the screen while you do it — let the reader guide you. You should be able to complete the core flows (login, main action, navigation) without sighted cues.
Common failures you will catch:
- Unlabeled icons (reader says "button")
- Reading order that jumps around
- Modal dialogs that do not trap focus
- Custom controls with no role (reader says "view")
Large font test
Set system font size to maximum (200%) in accessibility settings. Tour the app. Look for:
- Text clipped or truncated
- Buttons overflowing their containers
- Overlapping elements
- Important CTAs hidden below the fold
One-handed test
Switch your phone to the opposite hand. Try to reach every interactive element with your thumb. Notice which flows force two-handed use.
Color blind test
Use a color blindness simulator (Color Oracle, iOS Color Filters). Do charts, status indicators, and error messages still communicate meaning? Rule: never rely on color alone.
Automated testing
Automated tools catch roughly 30-40% of real accessibility issues. The rest — label quality, logical reading order, intent — requires a human. Automation is for regression, not for primary validation.
Android CI integration
// build.gradle
androidTestImplementation "androidx.test.espresso:espresso-accessibility:3.5.1"

// In test setup
import androidx.test.espresso.accessibility.AccessibilityChecks
import org.junit.Before

@Before
fun setUp() {
    AccessibilityChecks.enable().setRunChecksFromRootView(true)
}
Any Espresso test now fails if its view tree contains accessibility violations. Integrate into every UI test and your baseline stops regressing.
iOS CI integration
// In XCTestCase
func testAccessibility() throws {
let app = XCUIApplication()
app.launch()
try app.performAccessibilityAudit()
}
How SUSA does accessibility
SUSA's accessibility_user persona runs the entire exploration with TalkBack semantics in mind. It checks every screen for:
- Missing `contentDescription` on interactive elements (critical)
- Touch targets below 64dp (high)
- Contrast ratios below 4.5:1 (high, uses pixel sampling)
- Focus order mismatches (medium)
- Font sizes below 14sp for body copy (medium)
- Complex gestures without button alternatives (medium)
- Dynamic content not announced (detected via state change without announcement)
Post-exploration phase runs:
- Full TalkBack exploration of discovered screens
- Text scaling verification at 200%
- Keyboard navigation of all interactive flows (web)
- axe-core scan (web)
Output is a WCAG scorecard — one row per criterion, pass / fail / not-applicable, with specific violations and screenshots.
susatest-agent test myapp.apk --persona accessibility_user --wcag-level AA
Integrating into release cycle
Every pull request:
- Espresso/XCUITest accessibility checks run (automated)
- Lint a11y warnings cause build failure (automated)
Every build:
- SUSA accessibility_user run (automated)
- Accessibility Scanner pass on new screens (manual)
Every release:
- Full TalkBack / VoiceOver manual walkthrough of key flows
- Usability test with at least one user of assistive technology (monthly, not per-release)
Every quarter:
- External accessibility audit by a specialist firm (optional but recommended for regulated industries)
The bottom line
You cannot fully automate accessibility. You can automate enough that regression stops and new issues get caught quickly. For the rest — label quality, reading order, intent — a human with a screen reader and 30 minutes is the highest-leverage test in the entire QA process.
Make it part of the definition of done. Every feature ships with an accessibility review. The cost is an hour per feature. The cost of not doing it is a lawsuit, an app store rejection, or a large group of users who simply cannot use your product.
Test Your App Autonomously
Upload your APK or URL. SUSA explores like 10 real users — finds bugs, accessibility violations, and security issues. No scripts.
Try SUSA Free