SPOCK: A Smooth Pursuit Oculomotor Control Kit

DOI: 10.1145/2851581.2892291
Published: 07 May 2016
Abstract

    Gaze holds great potential for fast and intuitive hands-free user interaction. However, existing methods typically suffer from the Midas touch problem, i.e. the difficult distinction between gaze for perception and for user action; proposed solutions have required custom-tailored, application-specific user interfaces. Here, we present SPOCK, a novel gaze interaction method based on smooth pursuit eye movements requiring only minimal extensions to button-based interfaces. Upon looking at a UI element, two overlaid dynamic stimuli appear and tracking one of them triggers activation. In contrast to fixations and saccades, smooth pursuits are not only easily performed, but also easily suppressed, thus greatly reducing the Midas touch problem. We evaluated SPOCK against dwell time, the state-of-the-art gaze interaction method, in a simple target selection and a more challenging multiple-choice scenario. At higher task difficulty, unintentional target activations were reduced almost 15-fold by SPOCK, making this a promising method for gaze interaction.
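
    To make the selection mechanism concrete, the sketch below illustrates pursuit-based selection in general (in the spirit of Pursuits by Vidal et al. and Orbits by Esteves et al.), not the authors' SPOCK implementation: gaze samples from a short sliding window are correlated with the trajectory of each overlaid moving stimulus, and a stimulus is activated only when the gaze follows it closely. The Trace type, the function names, and the 0.8 correlation threshold are illustrative assumptions.

    ```python
    # Minimal sketch (not the SPOCK implementation): pursuit-based selection
    # by correlating a window of gaze samples with each overlaid stimulus
    # trajectory. Names, window handling, and threshold are assumptions.
    from dataclasses import dataclass
    from statistics import correlation  # Pearson's r, Python 3.10+


    @dataclass
    class Trace:
        """Matched x/y position samples over the same time window."""
        xs: list[float]
        ys: list[float]


    def pursuit_score(gaze: Trace, stimulus: Trace) -> float:
        """Mean Pearson correlation of gaze and stimulus positions on both axes."""
        return (correlation(gaze.xs, stimulus.xs)
                + correlation(gaze.ys, stimulus.ys)) / 2.0


    def select_stimulus(gaze: Trace, stimuli: list[Trace],
                        threshold: float = 0.8) -> int | None:
        """Return the index of the stimulus being pursued, or None.

        Activation requires the gaze trace to correlate with one stimulus
        trajectory above `threshold`; if the gaze merely fixates or scans
        (low correlation with every stimulus), nothing is triggered.
        """
        scores = [pursuit_score(gaze, s) for s in stimuli]
        best = max(range(len(scores)), key=scores.__getitem__)
        return best if scores[best] >= threshold else None
    ```

    A dwell-time selector, by comparison, triggers as soon as gaze rests on an element for a fixed duration; because resting gaze is also how we perceive, such activations are harder to suppress, which is the Midas touch problem that pursuit-based activation is designed to avoid.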






      Published In

      CHI EA '16: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
      May 2016
      3954 pages
      ISBN:9781450340823
      DOI:10.1145/2851581
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 07 May 2016


      Author Tags

      1. gaze gestures
      2. gaze tracking
      3. gaze-based interaction
      4. smooth pursuit

      Qualifiers

      • Abstract

      Funding Sources

      • Elite Network Bavaria

      Conference

CHI'16: CHI Conference on Human Factors in Computing Systems
May 7 - 12, 2016
San Jose, California, USA

      Acceptance Rates

CHI EA '16 Paper Acceptance Rate: 1,000 of 5,000 submissions, 20%
Overall Acceptance Rate: 6,164 of 23,696 submissions, 26%


Cited By

      • (2024) GazePuffer: Hands-Free Input Method Leveraging Puff Cheeks for VR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 331-341. DOI: 10.1109/VR58804.2024.00055 (online 16-Mar-2024)
      • (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies 183:C. DOI: 10.1016/j.ijhcs.2023.103196 (online 14-Mar-2024)
      • (2022) EOG-Based Human–Computer Interface: 2000–2020 Review. Sensors 22(13), 4914. DOI: 10.3390/s22134914 (online 29-Jun-2022)
      • (2022) Kuiper Belt: Utilizing the "Out-of-natural Angle" Region in the Eye-gaze Interaction for Virtual Reality. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3491102.3517725 (online 29-Apr-2022)
      • (2022) Augmented Reality Controlled Smart Wheelchair Using Dynamic Signifiers for Affordance Representation. 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 4812-4818. DOI: 10.1109/IROS40897.2019.8968290 (online 28-Dec-2022)
      • (2020) Supersaliency: A Novel Pipeline for Predicting Smooth Pursuit-Based Attention Improves Generalisability of Video Saliency. IEEE Access 8, 1276-1289. DOI: 10.1109/ACCESS.2019.2961835 (online 2020)
      • (2018) Contour-guided gaze gestures. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 1-10. DOI: 10.1145/3204493.3204530 (online 14-Jun-2018)
      • (2018) The Pursuing Gaze Beats Mouse in Non-Pop-Out Target Selection. 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 3518-3523. DOI: 10.1109/SMC.2018.00595 (online 7-Oct-2018)
      • (2017) Toward Everyday Gaze Input. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 1118-1130. DOI: 10.1145/3025453.3025599 (online 2-May-2017)
      • (2017) GazeEverywhere. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 3034-3044. DOI: 10.1145/3025453.3025455 (online 2-May-2017)
