DOI: 10.1145/3493612.3520454
Research article · W4A Conference Proceedings

Understanding the touchscreen-based nonvisual target acquisition task performance of screen reader users

Published: 27 April 2022

Abstract

    Understanding users' performance in finding and selecting a target is important for designing an efficient user interface. However, little is known about the performance of screen reader users, whose primary sense is audio. To better support touchscreen-based interaction for screen reader users, we conducted a user study with 12 participants with visual impairments, who performed a series of target acquisition tasks on a smartphone with a screen reader while we varied the screen size and the screen-to-target ratio. We found that, in general, participants found targets faster and with shorter touch traces when the screen was smaller and the target was larger. However, we also found that the ratio of the target size to the screen size affects task efficiency. In addition, we examined traces of touch events and identified five screen exploration strategies: zigzag, border-first, pigtail, hybrid, and other. Based on these findings, we suggest implications for designing efficient touchscreen-based user interfaces for screen reader users.
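    The abstract's core manipulated variables are the screen size and the target-to-screen size ratio, quantities that target acquisition studies commonly relate to difficulty via Fitts'-style measures. The following sketch is not the authors' actual analysis; it only illustrates, under hypothetical dimensions, how a target-to-screen ratio and the standard Shannon index of difficulty can be computed per experimental condition.

    ```python
    import math

    def target_screen_ratio(target_w, target_h, screen_w, screen_h):
        """Fraction of the screen area occupied by the target."""
        return (target_w * target_h) / (screen_w * screen_h)

    def index_of_difficulty(distance, width):
        """Shannon formulation of Fitts' index of difficulty (bits):
        ID = log2(D/W + 1), for movement distance D to a target of width W."""
        return math.log2(distance / width + 1)

    # Hypothetical condition (not the paper's parameters): a 10x10 mm
    # target on a 62x110 mm phone screen, with a 50 mm movement distance.
    ratio = target_screen_ratio(10, 10, 62, 110)   # ~0.0147 of the screen
    iod = index_of_difficulty(50, 10)              # log2(6) ~ 2.58 bits
    ```

    Note that classic Fitts' law assumes visually guided pointing; for nonvisual exploration of the kind studied here, trace length and exploration strategy are measured directly rather than predicted from ID alone.
    
    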



        Published In

        W4A '22: Proceedings of the 19th International Web for All Conference
        April 2022
        209 pages
        ISBN:9781450391702
        DOI:10.1145/3493612
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Author Tags

        1. blindness
        2. screen reader
        3. target acquisition task
        4. touchscreen
        5. visual impairments

        Qualifiers

        • Research-article

        Funding Sources

        • National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT)

        Conference

        W4A '22: 19th Web for All Conference
        April 25 - 26, 2022
        Lyon, France

        Acceptance Rates

        W4A '22 paper acceptance rate: 18 of 36 submissions (50%)
        Overall acceptance rate: 171 of 371 submissions (46%)
