Evaluate Current Compliance Status
To evaluate current compliance status, an organization combines an automated scan with a manual audit, reviews existing documentation, and maps findings against WCAG 2.1 AA. The scan flags approximately 25% of accessibility issues; the manual audit identifies the remaining 75%, which require human evaluation of screen reader behavior, keyboard operation, and content clarity. Together, these inputs produce a clear picture of where the site stands and what remediation work is needed to reduce legal risk under ADA Title II or Title III.
| Step | What It Covers |
|---|---|
| Automated Scan | Flags approximately 25% of WCAG issues across HTML, CSS, and ARIA. |
| Manual Audit | Identifies the remaining 75% through screen reader, keyboard, and code inspection. |
| Document Review | Confirms whether accessibility statements, policies, and prior reports exist. |
| Conformance Mapping | Aligns findings to WCAG 2.1 AA, the standard referenced under ADA Title II. |
Start with a Defined Scope
Before any evaluation work begins, define what is being assessed. List the templates, page types, user flows, and any authenticated areas that represent the site’s core functionality.
A homepage, a product page, a cart, a checkout, an account dashboard, and a contact form often represent a sufficient cross-section for a mid-sized site. Larger sites require broader scope to reflect real user activity.
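The cross-section above can be captured as a simple data structure so every downstream scan and audit works from the same list. This is a minimal sketch; the page names, URLs, and `authenticated` flags are hypothetical placeholders to adapt to your own templates.

```python
# Hypothetical scope definition for a mid-sized site; URLs are placeholders.
AUDIT_SCOPE = [
    {"name": "Homepage", "url": "/", "authenticated": False},
    {"name": "Product page", "url": "/products/example", "authenticated": False},
    {"name": "Cart", "url": "/cart", "authenticated": False},
    {"name": "Checkout", "url": "/checkout", "authenticated": True},
    {"name": "Account dashboard", "url": "/account", "authenticated": True},
    {"name": "Contact form", "url": "/contact", "authenticated": False},
]

def authenticated_pages(scope):
    """Return the pages that need login credentials before testing."""
    return [page["name"] for page in scope if page["authenticated"]]
```

Flagging authenticated areas up front matters because automated scanners typically need credentials or session handling configured before they can reach those pages.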
Conduct an Automated Scan First
An automated scan evaluates HTML, CSS, and ARIA attributes against WCAG success criteria. It produces quick coverage across many pages and identifies issues like missing alternative text references, form labeling problems, and certain document structure errors.
Scans are best treated as a first pass. They cover roughly 25% of WCAG issues, which means a clean scan report does not indicate a conformant site. It indicates that the machine-detectable portion has been reviewed.
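Because a scan covers many pages quickly, the useful output is usually an aggregate view rather than individual flags. The sketch below assumes a JSON result shape loosely modeled on what automated checkers emit (one record per flagged element, keyed by rule); the rule names and selectors are illustrative, not tied to any specific tool.

```python
import json
from collections import Counter

# Hypothetical scan output: one record per flagged element.
SCAN_JSON = """
[
  {"rule": "image-alt", "impact": "critical", "selector": "img.hero"},
  {"rule": "label", "impact": "critical", "selector": "#email"},
  {"rule": "heading-order", "impact": "moderate", "selector": "h4.promo"},
  {"rule": "label", "impact": "critical", "selector": "#phone"}
]
"""

def summarize(raw):
    """Count flagged issues per rule so recurring problems stand out."""
    return Counter(item["rule"] for item in json.loads(raw))
```

A per-rule count like this shows whether a problem is a one-off or a template-level defect repeated across every page that uses the same component.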
Conduct a Manual Audit
A manual audit is conducted by accessibility professionals using screen reader testing, keyboard testing, code inspection, and visual review across multiple assistive technologies and browsers. The auditor evaluates each in-scope page against WCAG 2.1 AA or 2.2 AA success criteria.
The audit identifies the 75% of issues that scans cannot detect, including focus order problems, ambiguous link text in context, screen reader announcements that do not match the visible interface, and keyboard traps. Most accessibility audits cost between $1,000 and $3,000, depending on page count and complexity.
Review Existing Documentation
Organizations often have partial accessibility documentation in place. Locate any prior audit reports, remediation logs, accessibility statements, vendor VPATs or ACRs, and internal policies.
Compare the dates and scope of these documents against the current site. A two-year-old audit report on a site that has since launched new templates no longer reflects the current state. Outdated documentation can create a false sense of conformance.
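The date comparison described above can be automated as a simple staleness check. This is a sketch under assumptions: the document names and dates are placeholders, and "last major release" stands in for whatever milestone (redesign, new templates) makes prior documentation unreliable.

```python
from datetime import date

# Hypothetical document inventory; names and dates are placeholders.
DOCUMENTS = [
    {"name": "Audit report", "dated": date(2023, 3, 1)},
    {"name": "Accessibility statement", "dated": date(2025, 1, 15)},
]

def stale_documents(docs, last_major_release):
    """Flag documents that predate the site's last major release."""
    return [d["name"] for d in docs if d["dated"] < last_major_release]
```

Anything this check flags should be treated as historical context, not as evidence of current conformance.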
Map Findings to a Standard
Once scan and audit results are in hand, map each issue to a specific WCAG success criterion. This produces a conformance picture rather than a list of disconnected problems.
For ADA Title II entities, WCAG 2.1 AA is the referenced standard. For Title III obligations, organizations commonly align to WCAG 2.1 AA or 2.2 AA to reduce risk, since the ADA does not name a specific technical standard for Title III.
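Mapping findings to success criteria can be as simple as grouping issues by criterion number. In the sketch below, the criterion identifiers (1.1.1 Non-text Content, 2.1.2 No Keyboard Trap, 3.3.2 Labels or Instructions) are real WCAG 2.1 numbers, but the findings themselves are illustrative.

```python
# Illustrative findings mapped to real WCAG 2.1 success criterion numbers.
FINDINGS = [
    {"issue": "Missing alt text on hero image", "criterion": "1.1.1"},
    {"issue": "Unlabeled email field", "criterion": "3.3.2"},
    {"issue": "Keyboard trap in modal dialog", "criterion": "2.1.2"},
    {"issue": "Unlabeled phone field", "criterion": "3.3.2"},
]

def conformance_map(findings):
    """Group findings under their success criterion to show conformance gaps."""
    grouped = {}
    for f in findings:
        grouped.setdefault(f["criterion"], []).append(f["issue"])
    return grouped
```

Grouping this way turns a flat defect list into a conformance statement: any criterion with an empty (or absent) bucket passes, and any criterion with entries is a documented gap.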
Prioritize Issues by Impact and Risk
Not every issue carries the same weight. Prioritization frameworks use two factors: user impact, meaning how severely the issue blocks people with disabilities from completing tasks, and risk factor, meaning how commonly the issue appears in legal claims.
Issues that block essential functionality, such as inaccessible checkout flows or unlabeled form fields, should be resolved before lower-impact items like decorative element labeling.
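The two-factor prioritization described above can be sketched as a simple score. The 1-3 scales and the example ratings are assumptions for illustration; an organization would calibrate them against its own user research and legal exposure.

```python
def priority_score(impact, risk):
    """impact and risk on a 1-3 scale; a higher product means fix sooner."""
    return impact * risk

# Illustrative issues as (description, user impact, legal risk) tuples.
ISSUES = [
    ("Inaccessible checkout flow", 3, 3),
    ("Unlabeled form fields", 3, 2),
    ("Decorative image labeling", 1, 1),
]

def triage(issues):
    """Sort issues so the highest-scoring ones are remediated first."""
    return sorted(issues, key=lambda i: priority_score(i[1], i[2]), reverse=True)
```

The ordering this produces matches the guidance above: blockers to essential functionality sort to the top, decorative labeling to the bottom.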
Establish a Baseline for Ongoing Monitoring
The evaluation produces a baseline. From there, scheduled scans can monitor for regressions, and follow-up audits can validate that remediation work has produced the intended results.
Authenticated pages, dynamic content, and frequently updated templates benefit from recurring scans on a weekly or monthly schedule. New features should be evaluated before launch rather than after.
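A recurring-scan plan can be expressed as data, too. This is a minimal sketch: the template names and the weekly/monthly intervals are assumptions mirroring the cadence suggested above, not requirements.

```python
from datetime import date, timedelta

# Hypothetical monitoring plan: high-churn templates weekly, stable ones monthly.
SCHEDULE = {
    "checkout": timedelta(weeks=1),
    "account dashboard": timedelta(weeks=1),
    "homepage": timedelta(weeks=4),
}

def due_for_scan(last_scanned, today):
    """Return templates whose scan interval has elapsed since their last scan."""
    return [name for name, interval in SCHEDULE.items()
            if today - last_scanned[name] >= interval]
```

Running a check like this on a scheduler (cron, CI pipeline) keeps regressions from accumulating silently between audits.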
