Fast Human-Computer Interaction by Combining Gaze Pointing and Face Gestures

Published: 11 August 2017

Abstract

In this work, we show how our open source accessibility software, the FaceSwitch, can help motor-impaired users interact efficiently with a computer hands-free. The FaceSwitch enhances gaze interaction with video-based face-gesture interaction. The resulting multimodal system allows interaction with a user interface by means of gaze pointing for target selection and facial gestures for target-specific action commands. The FaceSwitch maps facial gestures to specific mouse or keyboard events, such as left mouse click, right mouse click, or page scroll down; facial gestures thus serve the purpose of mechanical switches. With this multimodal interaction paradigm, the user gazes at the object in the user interface with which they want to interact and then triggers a target-specific action by performing a face gesture. Through a rigorous user study, we have obtained quantitative evidence suggesting that our proposed interaction paradigm outperforms traditional accessibility options, such as gaze-only interaction or gaze combined with a single mechanical switch, while coming close in speed and accuracy to traditional mouse-based interaction. We make the FaceSwitch software freely available to the community so the output of our research can help the target audience.
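The interaction model the abstract describes, gaze for pointing and face gestures as switches bound to mouse or keyboard events, can be sketched in a few lines. This is a minimal illustrative sketch: the gesture names, event types, and function names below are assumptions for exposition, not the actual FaceSwitch API.

```python
# Sketch of a FaceSwitch-style gesture-to-event dispatch.
# Gesture labels and event names are hypothetical placeholders.

from dataclasses import dataclass


@dataclass(frozen=True)
class InputEvent:
    kind: str    # "mouse" or "keyboard"
    action: str  # e.g. "left_click", "right_click", "page_down"


# User-configurable mapping: each detected face gesture acts as a switch
# that fires one target-specific mouse or keyboard event.
GESTURE_MAP = {
    "open_mouth":     InputEvent("mouse", "left_click"),
    "raise_eyebrows": InputEvent("mouse", "right_click"),
    "smile":          InputEvent("keyboard", "page_down"),
}


def dispatch(gesture, gaze_xy):
    """Combine the current gaze position (pointing) with a detected face
    gesture (action) into a concrete input event to synthesize at that
    screen coordinate; returns None for unmapped gestures."""
    event = GESTURE_MAP.get(gesture)
    if event is None:
        return None
    return (gaze_xy, event)
```

In a real system, the returned `(position, event)` pair would be handed to an OS-level input synthesizer (e.g. a library such as pynput) to move the cursor to the gaze position and emit the click or key press.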




    Published In

    ACM Transactions on Accessible Computing, Volume 10, Issue 3
    August 2017
    76 pages
    ISSN:1936-7228
    EISSN:1936-7236
    DOI:10.1145/3132048
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 11 August 2017
    Accepted: 01 March 2017
    Revised: 01 January 2017
    Received: 01 September 2015
    Published in TACCESS Volume 10, Issue 3


    Author Tags

    1. Gaze interaction
    2. accessibility
    3. accessible computing
    4. assistive technologies
    5. face tracking
    6. human factors
    7. human factors and ergonomics
    8. human performance
    9. human-computer interaction

    Qualifiers

    • Research-article
    • Research
    • Refereed

    Funding Sources

    • Commonwealth Scientific and Industrial Research Organisation (CSIRO)

    Article Metrics

    • Downloads (last 12 months): 52
    • Downloads (last 6 weeks): 6
    Reflects downloads up to 10 Nov 2024


    Cited By

    • (2024) Accessible human-computer interaction. Handbook of Accessible Communication, 497--512. DOI: 10.57088/978-3-7929-9120-6_25
    • (2024) Demonstration of CameraMouseAI: A Head-Based Mouse-Control System for People with Severe Motor Disabilities. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1--6. DOI: 10.1145/3663548.3688499 (27 Oct 2024)
    • (2024) GazePuffer: Hands-Free Input Method Leveraging Puff Cheeks for VR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 331--341. DOI: 10.1109/VR58804.2024.00055 (16 Mar 2024)
    • (2024) Optimization Algorithm for Intelligent Cockpit Human Computer Interaction Interface Design Based on Neural Network Model. 2024 International Conference on Electrical Drives, Power Electronics & Engineering (EDPEE), 738--742. DOI: 10.1109/EDPEE61724.2024.00142 (27 Feb 2024)
    • (2024) Gaze analysis. Image and Vision Computing 144:C. DOI: 10.1016/j.imavis.2024.104961 (1 Apr 2024)
    • (2024) Application of intelligent internet of things and interaction design in Museum Tour. Heliyon 10:16 (e35866). DOI: 10.1016/j.heliyon.2024.e35866 (Aug 2024)
    • (2024) Allowing for Secure and Accessible Authentication for Individuals with Disabilities of Dexterity. Human-Centered Software Engineering, 133--146. DOI: 10.1007/978-3-031-64576-1_7 (8 Jul 2024)
    • (2024) Personalized Facial Gesture Recognition for Accessible Mobile Gaming. Computers Helping People with Special Needs, 120--127. DOI: 10.1007/978-3-031-62846-7_15 (5 Jul 2024)
    • (2024) Eye-Gaze-Based Intention Recognition for Selection Task by Using SVM-RF. Human-Computer Interaction, 157--168. DOI: 10.1007/978-3-031-60449-2_11 (29 Jun 2024)
    • (2023) Task-Technology Fit and ICT Use in Remote Work Practice During the COVID-19 Pandemic. Journal of Global Information Management 31:1, 1--24. DOI: 10.4018/JGIM.324097 (8 Jun 2023)
