How Auditors Evaluate WCAG Conformance

Auditors evaluate WCAG conformance by working through each applicable success criterion on a set of representative pages or screens. The process combines screen reader testing, keyboard testing, visual inspection, and code inspection, with automated scan output used as a supporting data point. Each page is checked against every criterion at the target conformance level (typically 2.1 AA or 2.2 AA), and findings are documented with the specific location, the criterion it relates to, and the steps required for remediation.

| Element | What It Involves |
| --- | --- |
| Methodology | Screen reader testing, keyboard testing, visual inspection, code inspection, and review of automated scan output. |
| Assistive Technology | NVDA, JAWS, and VoiceOver across Chrome and Safari environments. |
| Scope | Representative pages and templates, including authenticated states and interactive components. |
| Standard | WCAG 2.1 AA or 2.2 AA, evaluated criterion by criterion at the defined conformance level. |
| Output | A report identifying each issue, its location, the related success criterion, and remediation guidance. |

Defining the Scope Before Evaluation Begins

An audit starts with a scoping conversation. The auditor and the organization agree on which pages, templates, and user flows represent the site, including logged-in states and transactional paths like checkout or account management.

Scope also defines the conformance target. Most organizations evaluate against WCAG 2.1 AA, though WCAG 2.2 AA is increasingly requested. The version and level determine which success criteria apply.
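The way a conformance target narrows the criteria list can be sketched in a few lines. The entries below are a small illustrative subset of real WCAG criteria, not the full list, and the filtering function is hypothetical:

```python
# A handful of real WCAG criteria, for illustration only (the full set is
# much larger). "since" records the WCAG version that introduced the criterion.
CRITERIA = [
    {"id": "1.1.1", "name": "Non-text Content", "level": "A", "since": "2.0"},
    {"id": "1.4.3", "name": "Contrast (Minimum)", "level": "AA", "since": "2.0"},
    {"id": "2.4.11", "name": "Focus Not Obscured (Minimum)", "level": "AA", "since": "2.2"},
    {"id": "1.4.6", "name": "Contrast (Enhanced)", "level": "AAA", "since": "2.0"},
]

LEVEL_ORDER = {"A": 1, "AA": 2, "AAA": 3}

def applicable(target_version: str, target_level: str):
    """Return the criteria in scope for a given WCAG version and level."""
    return [
        c for c in CRITERIA
        if c["since"] <= target_version  # criterion exists in this version
        and LEVEL_ORDER[c["level"]] <= LEVEL_ORDER[target_level]
    ]

# A WCAG 2.1 AA target excludes the 2.2-only criterion and the AAA criterion.
ids = [c["id"] for c in applicable("2.1", "AA")]
```

Moving the target to 2.2 AA would pull 2.4.11 into scope, which is exactly why the version and level must be fixed before evaluation begins.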

Screen Reader Testing

Screen reader testing evaluates how each page is announced. Using NVDA or JAWS on Windows and VoiceOver on macOS and iOS, the auditor works through headings, landmarks, links, forms, buttons, images, and interactive components to confirm that every element is announced accurately and in a logical reading order.

This part of the evaluation catches issues that code alone cannot reveal: missing accessible names, incorrect roles, reading order problems, and dynamic content that fails to announce state changes.

Keyboard Testing

The auditor sets the mouse aside and moves through the page using only the keyboard. Every interactive element must be reachable, operable, and visible when focused. Menus must open and close, modal dialogs must trap focus appropriately, and custom components must follow expected keyboard patterns.

Keyboard testing confirms that users who rely on switches, voice control, or screen readers can operate the interface the same way sighted mouse users can.
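The first step of keyboard testing, enumerating the elements a Tab keypress should reach, can be roughed out from the markup alone. This is a sketch with a deliberately short list of natively focusable tags; real tab order also depends on CSS visibility, the `disabled` attribute, and DOM order, which a static parse cannot see:

```python
from html.parser import HTMLParser

# Small sample of natively focusable elements (not exhaustive).
NATIVELY_FOCUSABLE = {"a", "button", "input", "select", "textarea"}

class TabStopFinder(HTMLParser):
    """Collect the elements expected in the Tab sequence, in source order."""

    def __init__(self):
        super().__init__()
        self.stops = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if a.get("tabindex") == "-1":
            return  # explicitly removed from the tab sequence
        if tag in NATIVELY_FOCUSABLE and not (tag == "a" and "href" not in a):
            self.stops.append(tag)
        elif a.get("tabindex") is not None:
            self.stops.append(tag)  # custom widget opted into the tab order

finder = TabStopFinder()
finder.feed('<a href="/">Home</a> <div tabindex="0" role="button">Menu</div> '
            '<span>text</span> <button>Save</button>')
```

The auditor can then compare this expected list of stops against what the browser actually reaches, flagging elements that are skipped or trapped.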

Visual and Code Inspection

Visual inspection covers text resizing at 200% and reflow at 400%, focus indicators, spacing, and the visual presentation of interactive states. The auditor also checks behavior under browser zoom and on mobile viewport sizes.

Code inspection uses browser developer tools to review HTML, CSS, and ARIA. The auditor confirms that semantic structure matches the visual presentation, that ARIA attributes are used correctly, and that custom widgets expose the right names, roles, and states.
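One narrow slice of that code inspection, checking that stateful ARIA roles actually expose their state, can be sketched as follows. The role-to-state table below is a small sample (the ARIA spec requires `aria-checked` on `checkbox`, `switch`, and `radio`); this is not a full ARIA validator:

```python
from html.parser import HTMLParser

# Roles whose state must be exposed via a required ARIA attribute.
# Illustrative subset only; the full mapping lives in the WAI-ARIA spec.
REQUIRED_STATE = {"checkbox": "aria-checked",
                  "switch": "aria-checked",
                  "radio": "aria-checked"}

class AriaStateChecker(HTMLParser):
    """Flag elements whose role requires a state attribute that is absent."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        required = REQUIRED_STATE.get(a.get("role", ""))
        if required and required not in a:
            self.issues.append(f'<{tag} role="{a["role"]}"> is missing {required}')

checker = AriaStateChecker()
checker.feed('<div role="checkbox" tabindex="0">Subscribe</div>'
             '<div role="switch" aria-checked="false">Dark mode</div>')
```

A screen reader user toggling the first widget would hear no checked/unchecked state, which is exactly the kind of gap this inspection step exists to catch.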

Where Automated Scans Fit In

Scans are part of an audit, but they are not the audit. Automated scans detect approximately 25% of accessibility issues, covering items that can be checked programmatically such as missing alt attributes, empty buttons, and form fields without labels.

The auditor reviews scan output as one input among several. The issues scans cannot detect require human judgment: whether alt text is meaningful, whether a heading structure reflects the content, whether an error message is clear, and whether a custom component behaves as users expect.
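The flavor of check a scanner can run programmatically is easy to illustrate. This is a minimal sketch covering two rules, images without `alt` attributes and inputs without an associated label; real tools such as axe-core implement far more rules and run against the live DOM rather than static markup:

```python
from html.parser import HTMLParser

class MiniScanner(HTMLParser):
    """Toy scanner: missing alt text and unlabeled form inputs."""

    def __init__(self):
        super().__init__()
        self.labelled_ids = set()
        self.inputs = []
        self.findings = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and "alt" not in a:
            self.findings.append("img missing alt attribute (SC 1.1.1)")
        elif tag == "label" and "for" in a:
            self.labelled_ids.add(a["for"])
        elif tag == "input" and a.get("type") not in ("hidden", "submit", "button"):
            self.inputs.append(a)

    def report(self):
        # Resolve label associations after the whole document is parsed.
        for a in self.inputs:
            if a.get("id") not in self.labelled_ids and "aria-label" not in a:
                self.findings.append(f"input #{a.get('id')} has no label (SC 4.1.2)")
        return self.findings

s = MiniScanner()
s.feed('<img src="hero.png"><label for="email">Email</label>'
       '<input id="email" type="text"><input id="phone" type="text">')
findings = s.report()
```

Note what the scanner cannot ask: whether an `alt` attribute that is present actually describes the image. That question, like most of the list above, needs a human.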

Documenting Findings Against Success Criteria

Each identified issue is mapped to the specific WCAG success criterion it relates to. The report includes the page or component where the issue appears, a description of the problem, the criterion reference, and guidance on how to remediate it.
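The shape of such a finding record can be sketched as a simple data structure. The field names and the example values here are hypothetical, chosen to mirror the report contents described above (location, description, criterion reference, remediation guidance):

```python
from dataclasses import dataclass, asdict

@dataclass
class Finding:
    location: str     # page URL or component where the issue appears
    description: str  # what is wrong, as observed
    criterion: str    # the WCAG success criterion it maps to
    remediation: str  # how to fix it
    severity: str = "moderate"  # illustrative field; scales vary by auditor

finding = Finding(
    location="/checkout (payment form)",
    description="Card number field has no programmatically associated label.",
    criterion="1.3.1 Info and Relationships (Level A)",
    remediation="Associate the visible label with the input via for/id.",
    severity="critical",
)
record = asdict(finding)  # serializable for the report or an issue tracker
```

Because every record carries its criterion reference, the report can be sorted by criterion, by page, or by severity, whichever view the development team needs.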

This structure lets development teams work through the report systematically and gives the organization a clear record of what was evaluated, what was identified, and what conformance level the site reaches after remediation. For ADA Title II entities that reference WCAG 2.1 AA, this documentation also serves as evidence of evaluation against the standard.

A thorough audit produces more than a list of issues. It produces a shared understanding of where the site stands against the standard and what it will take to reach conformance.
