DOI: 10.1145/3373625.3416994
ASSETS Conference Proceedings · Research article · Open access

Designing and Evaluating Head-based Pointing on Smartphones for People with Motor Impairments

Published: 29 October 2020

Abstract

Head-based pointing is an alternative input method for people with motor impairments to access computing devices. This paper proposes a calibration-free head-tracking input mechanism for mobile devices that uses the front-facing camera standard on most devices. To evaluate our design, we performed two Fitts’ Law studies: first, a comparison of our method with an existing head-based pointing solution, Eva Facial Mouse, with subjects without motor impairments; second, what we believe is the first Fitts’ Law study of a mobile head tracker with subjects with motor impairments. We extend prior studies to a greater range of indices of difficulty (IDs), [1.62, 5.20] bits, and achieved promising throughput (an average of 0.61 bps for participants with motor impairments and 0.90 bps for those without). Users’ throughput averaged 0.95 bps in our most difficult task (ID: 5.20 bits), which required selecting a target half the size of the Android-recommended touch target after moving nearly the full height of the screen; this suggests the system is capable of fine-precision tasks. We distill our observations and the lessons from our user studies into a set of design guidelines for head-based pointing systems.
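The abstract's metrics follow the standard Fitts' Law formulation: the index of difficulty (ID) grows with movement amplitude and shrinks with target width, and throughput is ID over movement time. A minimal sketch of those quantities (the amplitude and width values below are illustrative guesses, not numbers taken from the paper; the full ISO 9241-9 procedure would use the effective target width):

```python
import math

def index_of_difficulty(amplitude: float, width: float) -> float:
    # Shannon formulation: ID = log2(A / W + 1), measured in bits.
    return math.log2(amplitude / width + 1)

def throughput(amplitude: float, width: float, movement_time_s: float) -> float:
    # Throughput in bits per second: ID divided by movement time.
    # ISO 9241-9 substitutes the effective width W_e computed from the
    # spread of selection endpoints; nominal width keeps this sketch simple.
    return index_of_difficulty(amplitude, width) / movement_time_s

# Illustrative numbers: a 24 dp target (half the 48 dp Android touch-target
# recommendation) reached after an ~860 dp movement yields an ID near the
# paper's hardest condition of 5.20 bits.
print(round(index_of_difficulty(860, 24), 2))  # prints 5.2
```

With these definitions, the reported 0.95 bps in the hardest condition corresponds to a movement time of roughly ID / throughput ≈ 5.2 / 0.95 ≈ 5.5 s per selection.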



          Published In

          ASSETS '20: Proceedings of the 22nd International ACM SIGACCESS Conference on Computers and Accessibility
          October 2020
          764 pages
          ISBN:9781450371032
          DOI:10.1145/3373625
          This work is licensed under a Creative Commons Attribution International 4.0 License.

          Publisher

          Association for Computing Machinery

          New York, NY, United States


          Author Tags

          1. Accessibility
2. Head-based Pointing
          3. Input Techniques
          4. Mobile Devices
          5. User-Centered Design

          Qualifiers

          • Research-article
          • Research
          • Refereed limited

          Conference

          ASSETS '20

          Acceptance Rates

ASSETS '20 Paper Acceptance Rate: 46 of 167 submissions, 28%
Overall Acceptance Rate: 436 of 1,556 submissions, 28%


          Article Metrics

          • Downloads (Last 12 months)258
          • Downloads (Last 6 weeks)41
          Reflects downloads up to 04 Feb 2025

          Cited By

• (2024) A3C: An Image-Association-Based Computing Device Authentication Framework for People with Upper Extremity Impairments. ACM Transactions on Accessible Computing 17(2), 1–37. https://doi.org/10.1145/3652522. Online publication date: 19-Mar-2024.
• (2024) Investigating Technology Adoption Soon After Sustaining a Spinal Cord Injury. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(1), 1–24. https://doi.org/10.1145/3643507. Online publication date: 6-Mar-2024.
• (2024) Towards Personalized Head-Tracking Pointing. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1–7. https://doi.org/10.1145/3613905.3650996. Online publication date: 11-May-2024.
• (2023) Ability-Centered Examination of People with Motor Impairments’ Interaction with Television Towards More Accessible Smart Home Entertainment Environments. Ambient Intelligence—Software and Applications—13th International Symposium on Ambient Intelligence, 32–43. https://doi.org/10.1007/978-3-031-22356-3_4. Online publication date: 1-Jan-2023.
• (2022) Methodological Standards in Accessibility Research on Motor Impairments: A Survey. ACM Computing Surveys 55(7), 1–35. https://doi.org/10.1145/3543509. Online publication date: 15-Dec-2022.
• (2022) Learning a Head-Tracking Pointing Interface. Computers Helping People with Special Needs, 399–406. https://doi.org/10.1007/978-3-031-08648-9_46. Online publication date: 11-Jul-2022.
• (2021) “I...Got my Nose-Print. But it Wasn’t Accurate”: How People with Upper Extremity Impairment Authenticate on their Personal Computing Devices. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3411764.3445070. Online publication date: 6-May-2021.
