The accessibility industry does not yet have a standard way of calculating the compliance coverage of tools.
A single check against a particular success criterion does not guarantee full coverage of that criterion. The number of tests needed to meet a WCAG success criterion is an arbitrary metric.
As an example, Acquia Optimize performs 24 checks for alt text compliance. For the full list, see the user guide article Accessibility Checks.
Taking alt text on images as an example, the software can automatically determine whether an image has no alt text at all. What it cannot determine is whether the alt text is correct, for example, whether it says the picture shows a dog when it actually shows a cat. No accessibility checking engine is currently able to verify the accuracy of alt text, and the industry has not yet defined the quality of the checks or the methods for meeting the criteria.
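To make the distinction concrete, the sketch below shows the kind of check an automated engine can make: finding img elements that lack an alt attribute entirely. This is a minimal illustration written in TypeScript against the browser DOM, not Acquia Optimize's implementation; deciding whether an existing alt text actually describes the image still requires a human reviewer.

```ts
// Minimal sketch (not Acquia Optimize's implementation): find <img> elements
// that have no alt attribute at all. This is the part a machine can decide.
function findImagesMissingAlt(doc: Document): HTMLImageElement[] {
  return Array.from(doc.querySelectorAll<HTMLImageElement>("img")).filter(
    (img) => !img.hasAttribute("alt"),
  );
}

// Example usage in a browser context:
// findImagesMissingAlt(document).forEach((img) =>
//   console.warn("Image is missing alt text:", img.src),
// );
// Note: whether an existing alt text accurately describes the image
// cannot be decided automatically and must be reviewed by a person.
```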
Compare this with the methods used in the car manufacturing industry: imagine the confusion if every car manufacturer had its own method for measuring carbon dioxide emissions. The resulting readings would tell the consumer absolutely nothing.
What we are able to help with
When manual testing is needed, the issue is flagged through our 'warning' and 'review' check notifications.
For warnings, Acquia Optimize highlights an element that we are fairly sure may constitute an accessibility issue. These flagged elements need to be reviewed manually and either ignored in the platform if they do not contain an issue, or repaired so that the issue is not flagged again in subsequent scans.
For reviews, Acquia Optimize cannot say with certainty whether there is an issue, but we help by highlighting elements that should be reviewed manually. For instance, we could flag all videos to be reviewed for captions.