Over 70% of the Web Content Accessibility Guidelines (WCAG) Success Criteria require Manual Review.
By Giacomo Petri and Christian Federici
The debate around what level of accessibility validation can be achieved with automated testing tools has been ongoing for 20 years, with a range of different opinions. The simple fact is that automated accessibility testing alone is not enough, and the WCAG 2.1 update has increased the importance of conducting manual accessibility review and testing with assistive technology to ensure full conformance.
Below is a handy chart that breaks down the WCAG 2.0 and 2.1 Success Criteria, identifying whether automated testing can be used or a manual review is required. The numbers tell the hard truth: manual review and testing with assistive technology are fundamental. That means more effort than most companies want to hear about, and more than many accessibility industry advocates want to admit, because it places a large burden on companies.
With that said, automated testing is a great starting point and an important part of ongoing accessibility work, and any test that can be automated should be, for efficiency's sake. But why?
Automated testing can pick up a lot of issues quickly, perhaps even the majority of issues by sheer quantity, when working toward WCAG 2.0 or 2.1 AA compliance. It is also the most common way for law firms and industry advocates to identify sites that are falling short. So running and passing automated tests will improve a site's accessibility and reduce the potential for legal demand letters, but it does not ensure conformance with WCAG or guarantee that a user of assistive technology can actually use the site.
Where does Automatic Accessibility Testing fall short?
WCAG 2.0 and 2.1 establish guidelines, and suggest techniques for testing pages, through a number of Success Criteria at each level of conformance (A, AA, or AAA). Automated tests can cover only a small number of these Success Criteria, leaving the majority requiring a manual review. Many automated testing tools overstate their completeness by advertising how many automated tests they can run, but they rarely mention the number of Success Criteria that can NOT be fully tested automatically, which leads to misconceptions and incomplete results.
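To make the gap concrete, here is a minimal sketch in Python (not part of any UsableNet product, and far simpler than a real tool) of the kind of check that *can* be fully automated: a missing `lang` attribute on `<html>` (SC 3.1.1) or an `<img>` with no `alt` attribute at all (part of SC 1.1.1). Note what it cannot do: judge whether an `alt` value that is present is actually a meaningful text alternative.

```python
# A minimal sketch of a machine-detectable accessibility check.
# It flags structural omissions only; it cannot judge meaning.
from html.parser import HTMLParser

class BasicA11yChecker(HTMLParser):
    """Collects issues that require no human judgment to detect."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "html" and not attrs.get("lang"):
            self.issues.append("3.1.1: <html> is missing a lang attribute")
        if tag == "img" and "alt" not in attrs:
            self.issues.append("1.1.1: <img> has no alt attribute")

def check(html):
    checker = BasicA11yChecker()
    checker.feed(html)
    return checker.issues

# The second image below carries an alt attribute, so no issue is
# flagged -- but only a human can decide whether "photo1.jpg" is a
# meaningful text alternative. That is the manual-review gap.
page = '<html><body><img src="a.png"><img src="b.png" alt="photo1.jpg"></body></html>'
print(check(page))
```

The second image is the point: it passes the automated check while still failing the intent of the Success Criterion, which is exactly why the table below marks so many criteria as needing manual verification.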
Testing method needed (marked with an X in each row):

| Level A Success Criterion | WCAG | Fully Automatic | Auto & Manual Verification Needed | All Manual | UsableNet Audit and Release QA Service |
|---|---|---|---|---|---|
| 1.1.1 Non-text Content | 2.0 & 2.1 | | X | | AQA* & verify text & screen reader |
| 1.2.1 Audio-only and Video-only (Prerecorded) | 2.0 & 2.1 | | | X | Test with sound/screen off |
| 1.2.2 Captions (Prerecorded) | 2.0 & 2.1 | | | X | Test with sound/screen off |
| 1.2.3 Audio Description or Media Alternative (Prerecorded) | 2.0 & 2.1 | | | X | Test with sound/screen off |
| 1.3.1 Info and Relationships | 2.0 & 2.1 | | X | | AQA & verify text & verify code & screen reader |
| 1.3.2 Meaningful Sequence | 2.0 & 2.1 | | | X | Keyboard & verify code & screen reader |
| 1.3.3 Sensory Characteristics | 2.0 & 2.1 | | | X | Test with sound/screen off |
| 1.4.1 Use of Color | 2.0 & 2.1 | | | X | Visual verification |
| 1.4.2 Audio Control | 2.0 & 2.1 | | X | | AQA & keyboard & screen reader |
| 2.1.1 Keyboard | 2.0 & 2.1 | | X | | AQA & keyboard & screen reader |
| 2.1.2 No Keyboard Trap | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 2.1.4 Character Key Shortcuts | 2.1 | | | X | Keyboard & screen reader |
| 2.2.1 Timing Adjustable | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 2.2.2 Pause, Stop, Hide | 2.0 & 2.1 | | X | | AQA & keyboard & visual verification |
| 2.3.1 Three Flashes or Below Threshold | 2.0 & 2.1 | | | X | Visual verification |
| 2.4.1 Bypass Blocks | 2.0 & 2.1 | X | | | AQA |
| 2.4.2 Page Titled | 2.0 & 2.1 | | X | | AQA & verify text |
| 2.4.3 Focus Order | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 2.4.4 Link Purpose (In Context) | 2.0 & 2.1 | | X | | AQA & verify text & screen reader & verify code |
| 2.5.1 Pointer Gestures | 2.1 | | | X | Keyboard & screen reader & gestures |
| 2.5.2 Pointer Cancellation | 2.1 | | | X | Keyboard & screen reader & mouse events |
| 2.5.3 Label in Name | 2.1 | | | X | Keyboard & screen reader & verify code |
| 2.5.4 Motion Actuation | 2.1 | | | X | Keyboard & screen reader & real motions |
| 3.1.1 Language of Page | 2.0 & 2.1 | X | | | AQA |
| 3.2.1 On Focus | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 3.2.2 On Input | 2.0 & 2.1 | | X | | AQA & keyboard & screen reader |
| 3.3.1 Error Identification | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 3.3.2 Labels or Instructions | 2.0 & 2.1 | X | | | AQA |
| 4.1.1 Parsing | 2.0 & 2.1 | X | | | AQA |
| 4.1.2 Name, Role, Value | 2.0 & 2.1 | | X | | AQA & verify code |
| **WCAG 2.0 - TOTAL Level A** | **25** | **4** | **9** | **12** | |
| **WCAG 2.1 - TOTAL Level A** | **30** | **4** | **9** | **17** | |

| Level AA Success Criterion | WCAG | Fully Automatic | Auto & Manual Verification Needed | All Manual | UsableNet Audit and Release QA Service |
|---|---|---|---|---|---|
| 1.2.4 Captions (Live) | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 1.2.5 Audio Description (Prerecorded) | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 1.3.4 Orientation | 2.1 | | | X | Keyboard & screen reader |
| 1.3.5 Identify Input Purpose | 2.1 | | | X | Keyboard & screen reader & code verification |
| 1.4.3 Contrast (Minimum) | 2.0 & 2.1 | | X | | AQA & visual verification & code verification |
| 1.4.4 Resize text | 2.0 & 2.1 | | X | | AQA & visual verification |
| 1.4.5 Images of Text | 2.0 & 2.1 | | | X | Visual verification & verify code |
| 1.4.10 Reflow | 2.1 | | | X | Visual verification |
| 1.4.11 Non-text Contrast | 2.1 | | | X | Visual verification & code verification |
| 1.4.12 Text Spacing | 2.1 | | | X | Visual verification |
| 1.4.13 Content on Hover or Focus | 2.1 | | | X | Keyboard & screen reader & mouse events |
| 2.4.5 Multiple Ways | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 2.4.6 Headings and Labels | 2.0 & 2.1 | | X | | AQA & verify text |
| 2.4.7 Focus Visible | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 3.1.2 Language of Parts | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 3.2.3 Consistent Navigation | 2.0 & 2.1 | | | X | Visual verification & verify code |
| 3.2.4 Consistent Identification | 2.0 & 2.1 | | | X | Keyboard & screen reader & visual verification & verify code |
| 3.3.3 Error Suggestion | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 3.3.4 Error Prevention (Legal, Financial, Data) | 2.0 & 2.1 | | | X | Keyboard & screen reader |
| 4.1.3 Status Messages | 2.1 | | | X | Keyboard & screen reader |
| **WCAG 2.0 - TOTAL Level AA** | **13** | **0** | **3** | **10** | |
| **WCAG 2.1 - TOTAL Level AA** | **20** | **0** | **3** | **17** | |

| Totals | WCAG | Fully Automatic | Auto & Manual Verification Needed | All Manual |
|---|---|---|---|---|
| WCAG 2.0 - TOTAL Level A and AA | 38 | 4 | 12 | 22 |
| WCAG 2.1 - TOTAL Level A and AA | 50 | 4 | 12 | 34 |
Source: UsableNet Audit Team 2018. *AQA is UsableNet’s Automated Accessibility Platform.
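As an illustration of why a criterion like 1.4.3 Contrast (Minimum) lands in the "Auto & Manual" column, the contrast ratio itself is a mechanical calculation defined by WCAG and easily automated, as the Python sketch below shows; what still requires human or visual verification is determining which foreground and background colors actually apply to a given piece of rendered text. (This sketch is illustrative only and is not UsableNet code.)

```python
# The WCAG 2.x contrast-ratio calculation: the automatable half of SC 1.4.3.
def srgb_to_linear(channel):
    """Convert an 8-bit sRGB channel (0-255) to linear light per WCAG."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an (r, g, b) color."""
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """(L1 + 0.05) / (L2 + 0.05), with L1 the lighter of the two colors."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background: 21:1, well above the 4.5:1
# AA minimum for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A tool can run this formula over every declared color pair, but overlapping elements, background images, and gradients are why the table still calls for visual verification alongside the automated check.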
So how can you test your site fully to ensure compliance?
The short answer: conduct both automated and manual testing against all WCAG 2.0 and 2.1 Success Criteria. However, this can become a costly endeavor when tests need to run at a regular cadence across multiple releases. An accessibility platform such as UsableNet AQA can therefore help streamline the entire accessibility testing process, combining automated testing, manual review, and user testing.
UsableNet AQA provides automated testing features that fit into the development process, such as a Chrome extension and an API, along with expert review features that developers and QA teams can use simultaneously to check every Success Criterion that needs manual inspection and documentation. Additionally, conducting user testing with users of assistive technology will significantly reduce this manual review work. Unique to the UsableNet AQA platform, a user tester can provide feedback and usability results directly alongside the specific code review. UsableNet AQA is itself fully accessible, and all testing, review, and user-testing functions can be performed by all users.
In summary, while automated accessibility testing is a good and recommended part of an accessibility validation process, it remains insufficient for achieving and maintaining a fully compliant site. To ensure your site and user flows are accessible and usable to people with a wide range of abilities, begin with automated tests and follow them with manual screen reader verification by either experienced accessibility reviewers or daily users of assistive technology.
Want to learn more about incorporating a user-testing strategy into your accessibility program? Register for our webinar, "How to set up user testing as part of accessibility program."