DOI: 10.1145/3493612.3520456
Research article
Open access

For one or for all?: survey of educator perceptions of web speech-based auditory description in science interactives

Published: 27 April 2022

Abstract

The evolution of Web Speech has increased the ease of development and public availability of auditory description without screen reader software, broadening its reach to users who may benefit from spoken descriptions. Building on an existing design framework for auditory description of interactive web media, we designed an optional Voicing feature instantiated in two PhET Interactive Simulations regularly used by students and educators worldwide. We surveyed over 2000 educators to investigate their perceptions and preferences regarding the Web Speech-based Voicing feature and its broad appeal and effectiveness for teaching and learning. We find general approval of the Voicing feature among educators, and more moderate statement ratings than expected for the different preset speech levels we presented to them. Educators perceive the feature as beneficial both broadly and for specific populations, while some acknowledge particular populations for whom it remains ineffective. Lastly, we identify some variance in perceptions of the feature across different aspects of the simulation experience.
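
The Voicing feature draws on the browser's built-in speech synthesis rather than a screen reader. As a minimal sketch of that general mechanism (an illustration only, not PhET's implementation: the function name, fallback warning, and example description text are all invented here), a description string can be spoken through the Web Speech API like this:

    // Speak one description string via the Web Speech API; no screen reader required.
    // Sketch only: speakDescription and the sample text below are illustrative.
    function speakDescription(text: string, rate: number = 1.0): void {
      // Degrade gracefully in browsers without Web Speech support.
      if (!('speechSynthesis' in window)) {
        console.warn('Web Speech API unavailable; description not voiced.');
        return;
      }
      const utterance = new SpeechSynthesisUtterance(text);
      utterance.rate = rate;            // speaking rate; 1 is the browser default
      window.speechSynthesis.cancel();  // drop any queued or in-progress speech
      window.speechSynthesis.speak(utterance);
    }

    // Example: voice a response as a simulation element changes.
    speakDescription('The balloon moves closer to the sweater.');

Canceling before each new utterance keeps responses from piling up during rapid interaction; deciding how much to say at each moment (the preset speech levels educators rated in the survey) is a design layer on top of this basic mechanism.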

Cited By

  • Investigating Sensory Extensions as Input for Interactive Simulations. Proceedings of the Seventeenth International Conference on Tangible, Embedded, and Embodied Interaction (2023), 1-7. https://doi.org/10.1145/3569009.3573108. Online publication date: 26 February 2023.

      Published In

      W4A '22: Proceedings of the 19th International Web for All Conference
      April 2022
      209 pages
      ISBN: 9781450391702
      DOI: 10.1145/3493612
      This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 27 April 2022

      Author Tags

      1. auditory description
      2. educators
      3. interactives
      4. surveys
      5. web speech

      Qualifiers

      • Research-article

      Conference

      W4A '22: 19th Web for All Conference
      April 25-26, 2022
      Lyon, France

      Acceptance Rates

      W4A '22 Paper Acceptance Rate: 18 of 36 submissions, 50%
      Overall Acceptance Rate: 171 of 371 submissions, 46%

