
Accessibility Testing

Build accessible applications by integrating a11y testing into your development workflow. Covers WCAG compliance, automated scanning, manual testing, screen reader testing, and CI/CD integration.

Accessibility testing verifies that your application works for everyone — including users with visual, auditory, motor, or cognitive disabilities. It’s not a nice-to-have: in many jurisdictions it’s a legal requirement, and for enterprise products it’s increasingly a contract requirement. Beyond compliance, accessible software is better software: it’s more keyboard-navigable, more screen-reader-friendly, and more robust across devices.


WCAG Compliance Levels

| Level | Requirement | Coverage |
| --- | --- | --- |
| A | Minimum accessibility | Basic: alt text, keyboard navigation, no seizure triggers |
| AA | Standard (most legal requirements) | Color contrast, resize support, consistent navigation |
| AAA | Enhanced | Sign language, extended audio descriptions, cognitive aids |

Most organizations target WCAG 2.1 AA — this is the standard referenced in ADA lawsuits, EU accessibility directives, and enterprise procurement.


Automated Testing Tools

| Tool | Integration | Coverage | Best For |
| --- | --- | --- | --- |
| axe-core | Browser extension, CI, Playwright | ~57% of WCAG issues | Developer-friendly, extensible |
| Lighthouse | Chrome DevTools, CI | Subset of a11y checks | Quick audits with performance |
| pa11y | CLI, CI | WCAG 2.1 AA | CI pipeline integration |
| WAVE | Browser extension | Visual overlay | Designer/QA review |
| Deque axe DevTools | Chrome/Firefox extension | Extended axe rules | Professional auditing |
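
The axe-core + Playwright integration from the table can run as an ordinary end-to-end test. A minimal sketch, assuming `@playwright/test` and `@axe-core/playwright` are installed; the URL and test name are placeholders:

```typescript
// a11y.spec.ts: scan a page with axe-core inside a Playwright test.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('home page has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('https://example.com'); // placeholder URL

  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa']) // restrict to WCAG 2.1 A/AA rules
    .analyze();

  // Any violation fails the test; each entry lists the rule, impact, and offending nodes.
  expect(results.violations).toEqual([]);
});
```

Scoping the scan with `.include()` / `.exclude()` keeps noisy third-party widgets out of the report while you work through your own violations.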

What Automation Catches vs. What It Misses

| Automation Catches (~40%) | Requires Manual Testing (~60%) |
| --- | --- |
| Missing alt text | Alt text quality and accuracy |
| Color contrast ratios | Meaningful color usage |
| Missing form labels | Label clarity and context |
| Missing ARIA attributes | ARIA used correctly |
| Heading hierarchy violations | Heading text makes sense |
| Keyboard trap detection | Logical tab order |
| Missing lang attribute | Content language accuracy |

Manual Testing Procedures

Keyboard Navigation

| Test | Expected Behavior |
| --- | --- |
| Tab through all interactive elements | Focus moves in logical order |
| Shift+Tab to navigate backward | Focus moves backward correctly |
| Enter/Space to activate buttons | All buttons respond to keyboard |
| Escape to close modals/dropdowns | Overlay closes, focus returns to trigger |
| Arrow keys in menus/lists | Navigation within grouped elements |
| Focus visible on all elements | Clear visual focus indicator (not just browser default) |
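
Parts of these checks can also run as regression tests. A minimal sketch, assuming a Playwright setup; the URL, the `#email`, `#password`, and `#submit` selectors, and the post-login route are hypothetical:

```typescript
// keyboard-nav.spec.ts: sanity-check tab order and keyboard activation.
import { test, expect } from '@playwright/test';

test('login form is operable with the keyboard alone', async ({ page }) => {
  await page.goto('https://example.com/login'); // placeholder URL

  // Tab through the form and assert focus lands on each control in a logical order.
  await page.keyboard.press('Tab');
  await expect(page.locator('#email')).toBeFocused();
  await page.keyboard.press('Tab');
  await expect(page.locator('#password')).toBeFocused();
  await page.keyboard.press('Tab');
  await expect(page.locator('#submit')).toBeFocused();

  // Activate the focused submit button with Enter, as a keyboard-only user would.
  await page.keyboard.press('Enter');
  await expect(page).toHaveURL(/dashboard/); // hypothetical post-login route
});
```

The automated check only guards the tab order and activation; focus visibility and modal focus trapping still need a human looking at the screen.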

Screen Reader Testing

| Screen Reader | Platform | Browser |
| --- | --- | --- |
| NVDA | Windows | Chrome, Firefox |
| JAWS | Windows | Chrome, Edge |
| VoiceOver | macOS, iOS | Safari |
| TalkBack | Android | Chrome |
| Narrator | Windows | Edge |

Screen Reader Test Scenarios

| Scenario | What to Verify |
| --- | --- |
| Navigate headings (H key) | Heading hierarchy makes content scannable |
| Navigate links (K key) | Link text is descriptive (not “click here”) |
| Navigate forms | Labels announce correctly, error messages read |
| Navigate tables | Headers associated with cells |
| Dynamic content updates | ARIA live regions announce changes |
| Images | Alt text reads correctly, decorative images hidden |
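
For the “Dynamic content updates” scenario, announcements only happen if the live region already exists in the DOM before the update. A minimal browser TypeScript sketch, assuming a placeholder `<div id="toast-region">` in the page markup:

```typescript
// toast.ts: announce dynamic updates through an ARIA live region.
const region = document.getElementById('toast-region') as HTMLElement;
region.setAttribute('role', 'status');       // status role implies polite announcements
region.setAttribute('aria-live', 'polite');  // wait until the screen reader is idle
region.setAttribute('aria-atomic', 'true');  // read the whole message, not just the changed text

export function showToast(message: string): void {
  // Changing the text content of an existing live region is what triggers the announcement.
  region.textContent = message;
  setTimeout(() => { region.textContent = ''; }, 5000); // clear after five seconds
}
```

Verify with a screen reader that the message is read once and does not steal focus from the user’s current task.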

Color and Contrast

| Element | Minimum Contrast (AA) | Enhanced (AAA) |
| --- | --- | --- |
| Normal text (< 18pt) | 4.5:1 | 7:1 |
| Large text (≥ 18pt, or ≥ 14pt bold) | 3:1 | 4.5:1 |
| UI components and graphics | 3:1 | 3:1 |
| Focus indicators | 3:1 | 3:1 |

| Tool | What It Does |
| --- | --- |
| Contrast checker (WebAIM) | Verify specific color pairs |
| Colour Contrast Analyser | Desktop eyedropper tool |
| Stark (Figma plugin) | Design-time contrast checking |
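
The thresholds above come from the WCAG relative-luminance formula, which is simple enough to compute yourself, for example in a design-token lint step. A minimal sketch for opaque hex colors (no alpha handling; the function names are illustrative):

```typescript
// contrast.ts: WCAG 2.x contrast ratio between two "#rrggbb" colors.

// Convert an 8-bit sRGB channel to its linear-light value per the WCAG definition.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of a "#rrggbb" color.
function luminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => parseInt(hex.slice(i, i + 2), 16));
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio from 1:1 to 21:1; the lighter color's luminance goes in the numerator.
export function contrastRatio(fg: string, bg: string): number {
  const [lighter, darker] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: dark gray text on white passes AA (4.5:1) for normal text.
console.log(contrastRatio('#595959', '#ffffff').toFixed(2)); // ≈ 7.00
```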

ARIA Best Practices

| Rule | Example |
| --- | --- |
| Prefer semantic HTML first | Use `<button>`, not `<div role="button">` |
| Don’t override native semantics | Don’t add `role="heading"` to an `<h2>` |
| Keep ARIA states in sync | `aria-expanded` toggles with accordion state |
| Live regions for dynamic content | `aria-live="polite"` for toast notifications |
| Hidden content properly hidden | `aria-hidden="true"` for decorative elements |
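
Keeping `aria-expanded` in sync is mostly a matter of toggling the ARIA state and the visible state in the same handler. A minimal accordion sketch, assuming hypothetical `data-accordion-trigger` markup:

```typescript
// accordion.ts: keep aria-expanded in sync with the panel's visibility.
// Assumed markup (hypothetical):
//   <button data-accordion-trigger aria-expanded="false" aria-controls="panel-1">Details</button>
//   <div id="panel-1" hidden>...</div>
document.querySelectorAll<HTMLButtonElement>('[data-accordion-trigger]').forEach((trigger) => {
  const panel = document.getElementById(trigger.getAttribute('aria-controls') ?? '');
  if (!panel) return;

  trigger.addEventListener('click', () => {
    const expanded = trigger.getAttribute('aria-expanded') === 'true';
    // Toggle the ARIA state and the actual visibility together, so assistive
    // technology and sighted users always perceive the same state.
    trigger.setAttribute('aria-expanded', String(!expanded));
    panel.hidden = expanded;
  });
});
```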

CI/CD Integration

Pull Request:
    → axe-core scan (Playwright integration)
    → Block on any critical a11y violations
    → Report new warnings in PR comment

Pre-release:
    → Full Lighthouse a11y audit
    → pa11y scan of all public pages
    → Manual keyboard + screen reader spot check

Quarterly:
    → Full manual audit with screen reader
    → Third-party accessibility audit (if required by contract)
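
The pa11y scan in the pre-release stage above can be a small script rather than a manual run. A minimal sketch, assuming pa11y is installed and the page list is maintained by hand; the URLs are placeholders:

```typescript
// prerelease-a11y.ts: scan public pages with pa11y and fail the stage on accessibility errors.
import pa11y from 'pa11y';

const pages = [
  'https://example.com/',
  'https://example.com/pricing',
  'https://example.com/docs',
]; // placeholder list of public pages

async function main(): Promise<void> {
  let errorCount = 0;

  for (const url of pages) {
    const result = await pa11y(url, { standard: 'WCAG2AA' });
    const errors = result.issues.filter((issue) => issue.type === 'error');
    errorCount += errors.length;

    for (const issue of errors) {
      console.error(`${url}: ${issue.code}: ${issue.message} (${issue.selector})`);
    }
  }

  // A non-zero exit code fails the pre-release pipeline stage.
  process.exit(errorCount > 0 ? 1 : 0);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```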

Anti-Patterns

| Anti-Pattern | Problem | Fix |
| --- | --- | --- |
| Relying only on automated tools | Catches < 50% of real issues | Combine automated, manual, and screen reader testing |
| Adding ARIA to fix everything | ARIA misuse creates a worse experience | Use semantic HTML first, ARIA only when needed |
| Color as the only indicator | Color-blind users can’t distinguish states | Add icons, patterns, or text alongside color |
| Treating a11y as a post-launch audit | Expensive to retrofit | Test during development, on every PR |
| Ignoring keyboard navigation | Keyboard-only and screen reader users are blocked | Every interactive element must be keyboard accessible |

Checklist

  • WCAG 2.1 AA target documented and agreed upon
  • axe-core integrated into CI pipeline (block on critical violations)
  • Keyboard navigation tested for all interactive flows
  • Screen reader testing performed quarterly (VoiceOver + NVDA minimum)
  • Color contrast meets 4.5:1 for normal text, 3:1 for large text
  • All images have meaningful alt text (decorative images use an empty alt attribute or aria-hidden)
  • Form inputs have associated labels
  • Focus management: logical tab order, visible focus indicator, focus trapping in modals
  • ARIA live regions for dynamic content updates
  • Skip navigation link on all pages

:::note[Source] This guide is derived from operational intelligence at Garnet Grid Consulting. For accessibility consulting, visit garnetgrid.com. :::

Jakub Dimitri Rezayev
Founder & Chief Architect • Garnet Grid Consulting

Jakub holds an M.S. in Customer Intelligence & Analytics and a B.S. in Finance & Computer Science from Pace University. With deep expertise spanning D365 F&O, Azure, Power BI, and AI/ML systems, he architects enterprise solutions that bridge legacy systems and modern technology — and has led multi-million dollar ERP implementations for Fortune 500 supply chains.
