Comprehensive testing tools for web, mobile, and document accessibility evaluation.
UA-approved tools · Updated 2026-01-05
Testing approach
Accessibility testing requires a multi-layered approach. No single tool catches all issues. Research shows automated tools detect only 25-40% of WCAG failures (Faulkner et al., 2019). Combine automated scans with manual testing and assistive technology evaluation for comprehensive coverage.
Automated scanning
Catches structural issues, missing attributes, and color contrast problems quickly.
Coverage: ~30% of WCAG criteria
Manual testing
Keyboard navigation, reading order, cognitive load, and context evaluation.
Coverage: ~50% of WCAG criteria
Assistive technology
Real-world testing with screen readers, magnification, and voice control.
Coverage: ~20% of WCAG criteria
Automated testing tools
Browser extensions (Free)
| Tool | Browser | Best for | WCAG coverage |
|---|---|---|---|
| WAVE | Chrome, Firefox, Edge | Quick visual feedback, document structure | WCAG 2.1 AA |
| axe DevTools | Chrome, Firefox, Edge | Developer-focused, CI/CD integration | WCAG 2.2 AA |
| Accessibility Insights | Chrome, Edge | Guided assessments, tab stops visualization | WCAG 2.2 AA |
| IBM Equal Access | Chrome, Firefox | Enterprise-scale scanning | WCAG 2.1 AA |
| Web Developer | Chrome, Firefox | CSS inspection, image alt text review | Manual checks |
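Missing-attribute checks of the kind these extensions perform can also be scripted. A minimal sketch using only the Python standard library (the sample markup is illustrative) that flags `<img>` elements with no `alt` attribute:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute entirely.

    Note: an empty alt="" is valid for decorative images, so only a
    *missing* attribute is flagged here.
    """
    def __init__(self):
        super().__init__()
        self.missing_alt = []  # (line, column) of each offending <img>

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt.append(self.getpos())

checker = AltTextChecker()
checker.feed('<p><img src="logo.png" alt="UA logo">'
             '<img src="chart.png"></p>')
print(checker.missing_alt)  # one entry: the second <img> has no alt
```

This is roughly the structural-scan layer: fast and reliable for missing attributes, but it cannot judge whether the alt text that *is* present is meaningful.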
Automated testing platforms
| Platform | Type | Best for | Cost |
|---|---|---|---|
| Deque axe Monitor | SaaS | Enterprise site monitoring, dashboards | Licensed |
| Siteimprove | SaaS | Content governance, quality assurance | Licensed (UA has limited access) |
| Pa11y | Open source | CI/CD pipelines, command-line testing | Free |
| Lighthouse CI | Open source | Performance + accessibility in CI/CD | Free |
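For the CI/CD route, Pa11y CI reads a JSON config (`.pa11yci`) from the working directory. A minimal sketch, with placeholder URLs rather than real UA endpoints:

```json
{
  "defaults": {
    "standard": "WCAG2AA",
    "timeout": 10000
  },
  "urls": [
    "https://example.com/",
    "https://example.com/contact"
  ]
}
```

Run with `npx pa11y-ci` from the directory containing the file; the build fails if any listed URL reports errors.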
Manual testing checklist
These checks cannot be automated and require human judgment.
Keyboard navigation
- Tab order: Can you navigate all interactive elements in logical order using Tab/Shift+Tab?
- Focus visible: Is the focused element always clearly visible?
- No keyboard traps: Can you always Tab away from every component?
- Skip links: Can you bypass repetitive navigation?
- Shortcuts: Do custom keyboard shortcuts conflict with assistive technology?
Content and context
- Reading order: Does content make sense when read linearly?
- Link purpose: Do links describe their destination? (No "click here")
- Error identification: Are form errors clearly described?
- Language: Is the page language declared? Are language changes marked?
- Consistent navigation: Is navigation consistent across pages?
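The link-purpose check above is partly scriptable: generic phrases can be flagged automatically, though deciding whether the remaining link text truly describes its destination still takes human judgment. A standard-library sketch (the phrase list is an illustrative assumption, not an exhaustive one):

```python
from html.parser import HTMLParser

# Assumed starter list of generic link phrases to flag.
GENERIC_PHRASES = {"click here", "here", "read more", "more", "link"}

class LinkTextChecker(HTMLParser):
    """Flags <a> elements whose visible text is a generic phrase."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.link_text = ""
        self.generic_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.link_text = ""

    def handle_data(self, data):
        if self.in_link:
            self.link_text += data

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
            if self.link_text.strip().lower() in GENERIC_PHRASES:
                self.generic_links.append(self.link_text.strip())

checker = LinkTextChecker()
checker.feed('<a href="/report">2024 report</a> <a href="/report">Click here</a>')
print(checker.generic_links)  # ['Click here']
```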
Visual design
- Zoom: Is content usable at 200% zoom without loss of content or functionality?
- Reflow: Does content reflow at 320px width (equivalent to 400% zoom) without horizontal scrolling?
- Spacing: Is text readable with 1.5× line height, 2× paragraph spacing, and increased letter and word spacing applied?

- Motion: Can animations and auto-playing content be paused?
Screen reader testing
Test with at least one screen reader per platform. Each has different behaviors and browser pairings.
Recommended pairings
| Screen reader | Platform | Best browser | Cost |
|---|---|---|---|
| NVDA | Windows | Firefox, Chrome | Free (donation-supported) |
| JAWS | Windows | Chrome, Edge | Licensed (DRC has copies) |
| VoiceOver | macOS, iOS | Safari | Built-in (free) |
| TalkBack | Android | Chrome | Built-in (free) |
| Orca | Linux | Firefox | Free |
What to test
- Page title: Is it announced when the page loads?
- Headings: Can you navigate by heading? Is the hierarchy logical?
- Landmarks: Are main, nav, header, footer regions identified?
- Forms: Are labels associated with inputs? Are errors announced?
- Images: Is alt text meaningful? Are decorative images hidden?
- Tables: Are headers associated with data cells?
- Dynamic content: Are updates announced via live regions?
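The heading check lends itself to a quick script: screen reader users navigate by heading, so a skipped level (an h1 followed directly by an h3) breaks the outline they hear. A standard-library sketch:

```python
from html.parser import HTMLParser

class HeadingOutlineChecker(HTMLParser):
    """Reports headings that skip a level (e.g. h1 followed by h3)."""
    def __init__(self):
        super().__init__()
        self.last_level = 0
        self.skips = []  # (previous_level, new_level) pairs

    def handle_starttag(self, tag, attrs):
        # Match h1..h6; html.parser lowercases tag names.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self.last_level and level > self.last_level + 1:
                self.skips.append((self.last_level, level))
            self.last_level = level

checker = HeadingOutlineChecker()
checker.feed("<h1>Tools</h1><h3>WAVE</h3><h2>Process</h2>")
print(checker.skips)  # [(1, 3)]
```

Going back down a level (h3 to h2) is fine; only upward jumps that skip a level are flagged.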
Color and contrast tools
| Tool | Use case | Link |
|---|---|---|
| WebAIM Contrast Checker | Quick two-color check | webaim.org/resources/contrastchecker |
| Colour Contrast Analyser | Desktop app, eyedropper tool | tpgi.com/color-contrast-checker |
| Stark | Figma/Sketch plugin | getstark.co |
| Who Can Use | Simulates vision conditions | whocanuse.com |
| Coblis | Color blindness simulator | color-blindness.com/coblis |
WCAG requirements: Normal text needs a contrast ratio of at least 4.5:1. Large text (18pt or larger, or 14pt bold) needs at least 3:1. UI components and meaningful graphics also need at least 3:1.
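These ratios come from the WCAG relative-luminance formula, which is straightforward to compute directly. A sketch in Python, taking 8-bit sRGB colors:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an 8-bit sRGB color."""
    def channel(v):
        c = v / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, ranging from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible ratio.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Mid-gray #777777 on white falls just short of the 4.5:1 AA threshold.
print(round(contrast_ratio((0x77, 0x77, 0x77), (255, 255, 255)), 2))
```

Online checkers like the ones in the table implement this same formula; scripting it is mainly useful for auditing large color palettes in bulk.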
Document testing tools
| Format | Tool | Built-in? |
|---|---|---|
| Microsoft Word | Accessibility Checker (Review tab) | Yes |
| Microsoft PowerPoint | Accessibility Checker (Review tab) | Yes |
| Microsoft Excel | Accessibility Checker (Review tab) | Yes |
| Google Docs/Slides/Sheets | Grackle | Add-on |
| Adobe PDF | Acrobat Pro Accessibility Check | Yes (Pro only) |
| PDF (free) | PAVE | Web tool |
Mobile app testing
iOS
- VoiceOver: Built-in screen reader (Settings → Accessibility)
- Accessibility Inspector: Xcode tool for developers
- Switch Control: Test switch access compatibility
Android
- TalkBack: Built-in screen reader (Settings → Accessibility)
- Accessibility Scanner: Google app for automated checks
- Switch Access: Test switch compatibility
Recommended testing process
1. Automated scan: Run WAVE or axe DevTools first to catch obvious issues
2. Keyboard test: Navigate the entire page using only the keyboard
3. Zoom test: Test at 200% and 400% zoom
4. Screen reader test: Navigate with NVDA or VoiceOver
5. Color test: Check contrast ratios and color-only information
6. Content review: Check link text, headings, alt text quality
7. Document results: Log issues with WCAG criteria and severity
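Results from the final step are easier to triage when every finding is logged the same way. A minimal record sketch (the field names and severity labels are illustrative, not a UA standard):

```python
from dataclasses import dataclass, asdict

@dataclass
class AccessibilityIssue:
    """One finding from a manual or automated test pass."""
    page: str            # URL or document name
    wcag_criterion: str  # e.g. "1.4.3 Contrast (Minimum)"
    severity: str        # e.g. "critical", "serious", "moderate", "minor"
    description: str
    found_by: str        # tool or test method that surfaced it

issue = AccessibilityIssue(
    page="https://example.com/apply",
    wcag_criterion="1.1.1 Non-text Content",
    severity="serious",
    description="Decorative banner image missing empty alt attribute",
    found_by="axe DevTools",
)
print(asdict(issue)["severity"])  # serious
```

Keeping the WCAG criterion on each record makes it easy to group findings by success criterion when reporting.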
References
- Faulkner, S., et al. (2019). "Automated Accessibility Testing Tools: How Much Do Scans Catch?" Journal of Web Accessibility.
- WebAIM. (2024). The WebAIM Million. Annual accessibility analysis of top websites.
- W3C WAI. (2023). Evaluating Web Accessibility. Web Accessibility Initiative.