A screenshot of 'the world's least accessible website'
The British team built a stand-alone webpage full of accessibility fails so that they could audit 10 popular automated testing tools and see how accurate they were. All up, they created 143 fails, grouped into 19 categories, and you can check out the scope and contents of this inaccessible webpage.
As Mehmet Duran explained in a blogpost about the inaccessible website test in February 2017, “the fails include things like images without alt attributes, or with the wrong alt attributes, and blank link text. We also put in a number of things that we thought testing tools probably wouldn’t be able to detect, but are also accessibility issues. Things like flashing content that didn’t carry a warning, or plain language not being used in content.”
“We knew there was no way we could put in every potential accessibility barrier,” added Mehmet, “but we wanted to have enough on the page so that we could adequately test how useful the tools were. We then ran the tools against the page, to find out how many of the fails they would pick up and how many they would miss.”
The findings of the inaccessible webpage test support what specialists involved in web accessibility already know from field experience, which is that automated tools are useful but certainly not infallible.
As Mehmet Duran said in his blogpost, “no tool will be able to pick up every accessibility barrier on a website. So just because a tool hasn’t picked up any accessibility issues on a website, doesn’t mean those issues don’t exist. And even if they do detect a barrier, sometimes the results they give will be inconclusive or require further investigation. Or even just wrong.”
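To illustrate why automated checks have these limits, here is a minimal sketch (not any of the audited tools, and simplified for illustration) of an alt-attribute checker using Python's standard-library HTML parser. It reliably flags an image with no alt attribute at all, but an image with the wrong alt text sails straight through, because judging whether the text actually describes the image needs a human reviewer.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Naive automated check: flags <img> tags with no alt attribute."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                self.issues.append("img missing alt attribute")
            # Note: a wrong or meaningless alt value (e.g. alt="img123.jpg")
            # passes this check untouched. Deciding whether alt text is
            # genuinely descriptive is exactly the kind of barrier an
            # automated tool cannot detect on its own.

# One image with no alt, one with a filename as its alt text.
page = '<img src="a.png"><img src="b.png" alt="b.png">'
checker = AltChecker()
checker.feed(page)
print(checker.issues)  # only the first image is flagged
```

Only the missing-alt case is reported; the unhelpful alt text on the second image goes unnoticed, which mirrors the gap between what the tools caught and what the test page contained.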
Results across the 10 tools varied widely, with Google Developer Tools (quite a popular tool) faring the worst, picking up only 17% of the barriers. Counting only error messages and warnings, Tenon picked up the most barriers, finding 37% of them. The highest score, if you also count manual inspection prompts, was achieved by the Asqatasun webpage analyser, which found 41% of the barriers.
This of course means that even the top-of-the-class tester did not find 59% of the fails. What’s more, 29% of the barriers the team created were not picked up by any of the tools tested. Of the 143 barriers created, a total of 42 were missed by every single one of them.
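The figures above can be checked with a line or two of arithmetic, using only the numbers given in the article:

```python
# Quick check of the article's figures.
total_fails = 143
best_found_pct = 41   # Asqatasun, counting manual-inspection prompts
missed_by_all = 42    # barriers no tool detected

missed_by_best_pct = 100 - best_found_pct
missed_by_all_pct = round(missed_by_all / total_fails * 100)

print(missed_by_best_pct)  # 59: share the best tool still missed
print(missed_by_all_pct)   # 29: share missed by every tool tested
```

Both results match the percentages reported above.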
This test case certainly supports the need for manual inspection, alongside automated tool testing, in order to fully realise web accessibility. If your organisation needs assistance with website accessibility testing or other web and digital access services, you can contact Media Access Australia’s digital accessibility services team.