Accessibility Testing for Mobile Apps: Practical Guide (2026)

June 25, 2026 · 4 min read · Testing Guides

Accessibility testing is the process of verifying that your app works for people with disabilities. It is a legal requirement in most regulated markets (ADA, EN 301 549, EAA) and a moral one everywhere. This guide covers what to test, which tools to use, and how to integrate accessibility checks into a mobile release cycle.

What accessibility testing actually covers

Four broad user groups:

  1. Vision — blindness, low vision, color blindness, age-related degeneration
  2. Motor — limited dexterity, tremor, one-handed use, switch access
  3. Hearing — deafness, partial loss
  4. Cognitive — memory impairment, learning differences, attention disorders

Each group needs different affordances. Vision needs screen readers and contrast. Motor needs large targets and keyboard / switch support. Hearing needs captions and visual indicators of audio events. Cognitive needs clarity, consistency, and error recovery.

What to test — the minimum viable checklist

Screen reader compatibility

  1. Every interactive element has an accessible label
  2. Labels are meaningful — "Submit button" not "Button"
  3. Decorative images are marked decorative (not announced)
  4. Dynamic content changes are announced (announceForAccessibility, aria-live)
  5. Reading order matches visual order
  6. Custom views expose proper roles (AccessibilityNodeInfo.setClassName)
  7. Gestures have non-gesture alternatives (swipe → button)
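
Items 1 and 2 above are what automated scanners approximate with simple heuristics. A minimal sketch of such a label-quality check in plain Java; the generic-word list is illustrative, not taken from any particular tool:

```java
import java.util.List;

// Sketch: flag accessible labels that are missing, blank, or add nothing
// beyond the element's role ("Button" instead of "Submit order").
public class LabelCheck {
    static final List<String> GENERIC = List.of("button", "image", "icon", "unlabeled");

    // A label is meaningful only if it is present, non-blank, and not generic.
    static boolean isMeaningful(String label) {
        if (label == null || label.isBlank()) return false;
        return !GENERIC.contains(label.trim().toLowerCase());
    }

    public static void main(String[] args) {
        System.out.println(isMeaningful("Submit order")); // true
        System.out.println(isMeaningful("Button"));       // false: role only, no meaning
        System.out.println(isMeaningful(""));             // false: missing label
    }
}
```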

Touch targets

  1. Minimum 48dp × 48dp per Android guidelines (44pt × 44pt per Apple's HIG)
  2. Larger targets (56dp or more) recommended for users with motor impairments; WCAG's AAA target-size criterion (2.5.5, Enhanced) sets the floor at 44 × 44 CSS px
  3. Spacing between targets at least 8dp
  4. Floating Action Buttons clear of fixed elements
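
Because dp is density-independent, a view measured in pixels has to be converted back before checking against the 48dp floor. A small sketch of the arithmetic in plain Java; densityDpi and the pixel sizes stand in for what you would read from DisplayMetrics and the view bounds at runtime:

```java
// Sketch: verify an on-screen target meets the 48dp minimum regardless of density.
public class TouchTargetCheck {
    static final int MIN_DP = 48; // Android Material guideline; Apple's HIG uses 44pt

    // Convert device pixels back to density-independent pixels (baseline 160 dpi).
    static double pxToDp(int px, int densityDpi) {
        return px * 160.0 / densityDpi;
    }

    static boolean meetsMinimum(int widthPx, int heightPx, int densityDpi) {
        return pxToDp(widthPx, densityDpi) >= MIN_DP
            && pxToDp(heightPx, densityDpi) >= MIN_DP;
    }

    public static void main(String[] args) {
        // A 96x96 px button on a 320 dpi screen is exactly 48x48 dp.
        System.out.println(meetsMinimum(96, 96, 320)); // true
        // The same 96 px button on a 480 dpi screen is only 32 dp: too small.
        System.out.println(meetsMinimum(96, 96, 480)); // false
    }
}
```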

Contrast

  1. Body text ≥ 4.5:1 against background
  2. Large text (≥ 18pt, or ≥ 14pt bold) ≥ 3:1
  3. UI component borders ≥ 3:1 (WCAG 1.4.11)
  4. Focus indicators visible in all themes
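
The 4.5:1 and 3:1 thresholds come from WCAG's relative-luminance formula, which is straightforward to compute yourself. A plain-Java sketch using packed 0xRRGGBB colors (#767676 on white is the classic grey that just passes 4.5:1):

```java
// Sketch: WCAG 2.x contrast ratio between two packed 0xRRGGBB colors.
public class ContrastCheck {
    // Linearize one sRGB channel (0-255) per the relative-luminance formula.
    static double linearize(int channel) {
        double c = channel / 255.0;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Relative luminance of a 0xRRGGBB color.
    static double luminance(int rgb) {
        double r = linearize((rgb >> 16) & 0xFF);
        double g = linearize((rgb >> 8) & 0xFF);
        double b = linearize(rgb & 0xFF);
        return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    // Contrast ratio, always >= 1 (lighter luminance goes on top).
    static double contrast(int fg, int bg) {
        double l1 = Math.max(luminance(fg), luminance(bg));
        double l2 = Math.min(luminance(fg), luminance(bg));
        return (l1 + 0.05) / (l2 + 0.05);
    }

    public static void main(String[] args) {
        System.out.printf("black on white: %.1f:1%n", contrast(0x000000, 0xFFFFFF)); // 21.0:1
        System.out.printf("grey on white:  %.1f:1%n", contrast(0x767676, 0xFFFFFF)); // ~4.5:1
        System.out.println("body text ok: " + (contrast(0x767676, 0xFFFFFF) >= 4.5));
    }
}
```

The same check runs against whatever colors you pull from your theme; the formula does not change per platform.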

Text

  1. Text respects system font size (up to 200%)
  2. No text in images (or alt text provided)
  3. Line height at least 1.5x font size (WCAG 1.4.12)
  4. Hyphenation does not cut critical info
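
Item 1 is why text must be declared in scalable units (sp on Android): the rendered size multiplies in the user's font scale. A plain-Java sketch of the arithmetic; the density and fontScale values are illustrative:

```java
// Sketch: how sp-sized text grows under the user's font-scale setting.
public class FontScaleCheck {
    // Pixel size of text declared in sp, given screen density and user fontScale.
    static double spToPx(double sp, double density, double fontScale) {
        return sp * density * fontScale;
    }

    public static void main(String[] args) {
        double density = 2.0; // e.g. an xhdpi screen
        System.out.println(spToPx(16, density, 1.0)); // 32.0 px at default scale
        System.out.println(spToPx(16, density, 2.0)); // 64.0 px at 200%: layouts must absorb this
    }
}
```

Hard-coding the size in px (or dp) silently drops the fontScale factor, which is exactly the "text does not scale" failure the large-font test catches.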

Motion and animation

  1. Respects reduced-motion system setting
  2. Auto-scrolling content can be paused
  3. No strobe or flash > 3 times per second (photosensitivity)
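
The three-flash limit (WCAG 2.3.1) applies to any one-second window, not just whole seconds, so a check has to slide the window across the timeline. A plain-Java sketch; the timestamps are assumed to come from your own animation or frame log:

```java
import java.util.List;

// Sketch: no more than three flashes may fall inside ANY 1000 ms window.
public class FlashCheck {
    static boolean isSafe(List<Long> flashTimesMs) {
        // Start a candidate window at each flash and count flashes inside it.
        for (long windowStart : flashTimesMs) {
            int count = 0;
            for (long t : flashTimesMs) {
                if (t >= windowStart && t < windowStart + 1000) count++;
            }
            if (count > 3) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isSafe(List.of(0L, 400L, 800L, 1200L))); // true: at most 3 per window
        System.out.println(isSafe(List.of(0L, 200L, 400L, 600L))); // false: 4 inside one second
    }
}
```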

Orientation and input

  1. Works in portrait AND landscape
  2. Works with external keyboard
  3. Works with switch control (iOS) / Switch Access (Android)
  4. Works with voice control

Forms

  1. Input fields have explicit labels (not just placeholders)
  2. Error messages linked to the failing field programmatically
  3. Validation errors announced on submit
  4. Required fields marked visibly AND programmatically
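
Items 2 and 3 imply that validation must keep a programmatic link from each error back to its field, not just render a red banner. One way to model that, as a plain-Java sketch; the field ids and rules are illustrative:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: a validation result that maps failing field ids to error text,
// so the UI can both mark each field AND announce all failures on submit.
public class FormValidation {
    static Map<String, String> validate(String email, String password) {
        Map<String, String> errors = new LinkedHashMap<>();
        if (email == null || !email.contains("@"))
            errors.put("field_email", "Enter a valid email address");
        if (password == null || password.length() < 8)
            errors.put("field_password", "Password must be at least 8 characters");
        return errors; // empty map == valid
    }

    // One screen-reader announcement summarizing every failure.
    static String announcement(Map<String, String> errors) {
        if (errors.isEmpty()) return "Form submitted";
        return errors.size() + " errors. " + String.join(". ", errors.values());
    }

    public static void main(String[] args) {
        System.out.println(announcement(validate("not-an-email", "short")));
    }
}
```

On Android the per-field half of this maps naturally onto something like TextInputLayout's error slot; the key point is that the field-to-error link survives into the accessibility tree.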

Testing tools

Android

  1. Accessibility Scanner: Google's on-device app; flags missing labels, small touch targets, and low contrast
  2. espresso-accessibility: AccessibilityChecks inside instrumented tests (CI integration below)
  3. Android Studio Lint: static warnings such as missing contentDescription
  4. TalkBack: the screen reader itself, for manual tours

iOS

  1. Accessibility Inspector: ships with Xcode; audits a running app for missing labels, contrast, and hit-region issues
  2. XCTest audits: performAccessibilityAudit() on XCUIApplication (Xcode 15+)
  3. VoiceOver: Apple's screen reader, for manual tours

Cross-platform

  1. axe DevTools Mobile: automated scans for native Android and iOS apps
  2. SUSA: autonomous, persona-driven exploration that outputs a WCAG scorecard

Manual testing approach

Screen reader tour

Enable TalkBack / VoiceOver. Explore the app using only swipes and double-taps. Do not look at the screen while you do it — let the reader guide you. You should be able to complete the core flows (login, main action, navigation) without sighted cues.

Common failures you will catch:

  1. Icon buttons announced as just "Button", or skipped entirely
  2. Focus order that jumps around instead of following the visual layout
  3. Toasts, spinners, and validation errors that appear silently
  4. Dialogs and carousels that trap focus with no way to swipe out

Large font test

Set system font size to maximum (200%) in accessibility settings. Tour the app. Look for:

  1. Truncated or clipped labels and button text
  2. Overlapping elements and broken layouts
  3. Text that does not scale at all (hard-coded pixel sizes instead of sp or Dynamic Type)
  4. Controls pushed off-screen with no way to scroll to them

One-handed test

Switch your phone to the opposite hand. Try to reach every interactive element with your thumb. Notice which flows force two-handed use.

Color blind test

Use a color blindness simulator (Color Oracle, iOS Color Filters). Do charts, status indicators, and error messages still communicate meaning? Rule: never rely on color alone.

Automated testing

Automated tools catch roughly 30-40% of real accessibility issues. The rest — label quality, logical reading order, intent — requires a human. Automation is for regression, not for primary validation.

Android CI integration


// build.gradle (module)
androidTestImplementation "androidx.test.espresso:espresso-accessibility:3.5.1"

// In the test class: enable once for the whole class
import androidx.test.espresso.accessibility.AccessibilityChecks
import org.junit.BeforeClass

companion object {
    @BeforeClass @JvmStatic
    fun enableAccessibilityChecks() {
        AccessibilityChecks.enable().setRunChecksFromRootView(true)
    }
}

Any Espresso test now fails if its view tree contains accessibility violations. Integrate into every UI test and your baseline stops regressing.

iOS CI integration


// In an XCTestCase (Xcode 15+)
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAccessibility() throws {
        let app = XCUIApplication()
        app.launch()
        try app.performAccessibilityAudit()
    }
}

How SUSA does accessibility

SUSA's accessibility_user persona runs the entire exploration with TalkBack semantics in mind, checking every screen against the criteria above: labels, reading order, contrast, touch targets. A post-exploration phase aggregates the findings.

Output is a WCAG scorecard: one row per criterion, pass / fail / not-applicable, with specific violations and screenshots.


susatest-agent test myapp.apk --persona accessibility_user --wcag-level AA

Integrating into release cycle

Every pull request: UI tests run with accessibility checks enabled (espresso-accessibility, performAccessibilityAudit), so new violations block the merge.

Every build: CI re-runs the automated suite; regressions against the accessibility baseline fail the build.

Every release: a manual screen reader tour of the core flows, plus the large-font and one-handed tests described above.

Every quarter: a full audit against the WCAG checklist, ideally including sessions with users of assistive technology.

The bottom line

You cannot fully automate accessibility. You can automate enough that regression stops and new issues get caught quickly. For the rest — label quality, reading order, intent — a human with a screen reader and 30 minutes is the highest-leverage test in the entire QA process.

Make it part of definition-of-done. Every feature ships with an accessibility review. The cost is an hour per feature. The cost of not doing it is a lawsuit, an app store rejection, or a large group of users who simply cannot use your product.

Test Your App Autonomously

Upload your APK or URL. SUSA explores like 10 real users — finds bugs, accessibility violations, and security issues. No scripts.

Try SUSA Free