DOI: 10.1145/3430263.3452436
Short paper

The transparency of automatic accessibility evaluation tools

Published: 20 May 2021

Abstract

Several web accessibility evaluation tools have been proposed to reduce the burden of identifying accessibility barriers for disabled people. A common issue when using such tools in practice is that different tools often report different results. This variability can confuse users who do not understand the reasons behind it, and thus limits the adoption of such tools. There is therefore a need to shed light on how the tools actually work, to indicate what criteria they should adopt to be transparent, and to help users better interpret their results. In this communication paper, we discuss these issues, analyse how they have been addressed by a representative set of tools, and provide useful indications for obtaining user-centred accessibility evaluations.




    Information

    Published In

    W4A '21: Proceedings of the 18th International Web for All Conference
    April 2021
    224 pages
    ISBN:9781450382120
    DOI:10.1145/3430263
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 20 May 2021


    Author Tags

    1. accessibility
    2. automatic validation tools
    3. transparency

    Qualifiers

    • Short-paper

    Conference

    W4A '21
    W4A '21: 18th Web for All Conference
    April 19 - 20, 2021
    Ljubljana, Slovenia

    Acceptance Rates

    Overall Acceptance Rate 171 of 371 submissions, 46%


    Bibliometrics & Citations

    Article Metrics

    • Downloads (Last 12 months): 57
    • Downloads (Last 6 weeks): 5
    Reflects downloads up to 15 Oct 2024


    Citations

    Cited By

    • (2023) Starting well on design for accessibility: analysis of W3C's 167 accessibility evaluation tools for the design phase. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1-7. https://doi.org/10.1145/3597638.3614474 (online publication date: 22-Oct-2023)
    • (2023) The Transparency of Automatic Web Accessibility Evaluation Tools: Design Criteria, State of the Art, and User Perception. ACM Transactions on Accessible Computing 16(1), 1-36. https://doi.org/10.1145/3556979 (online publication date: 28-Mar-2023)
    • (2022) Accessible Design is Mediated by Job Support Structures and Knowledge Gained Through Design Career Pathways. Proceedings of the ACM on Human-Computer Interaction 6(CSCW2), 1-24. https://doi.org/10.1145/3555588 (online publication date: 11-Nov-2022)
    • (2022) Evaluation methods in legal procedures concerning digital accessibility in Brazil. Proceedings of the 21st Brazilian Symposium on Human Factors in Computing Systems, 1-12. https://doi.org/10.1145/3554364.3559130 (online publication date: 17-Oct-2022)
