DOI: 10.1145/3098279.3098561
Research article, MobileHCI Conference Proceedings

Designing a gaze gesture guiding system

Published: 04 September 2017

Abstract

    We propose the concept of a guiding system specifically designed for semaphoric gaze gestures, i.e., gestures that define a vocabulary for triggering commands via the gaze modality. Our design exploration considers three fundamental gaze gesture phases: Exploration, Guidance, and Return. A first experiment reveals that Guidance with dynamic elements moving along 2D paths is efficient and robust to visual complexity. A second experiment reveals that a Rapid Serial Visual Presentation (RSVP) of command names during Exploration enables command retrieval more than 30% faster than a standard visual search. For the Return phase, in which users resume the task at the point where the guide was triggered, labels moving from the outward extremity of the 2D paths toward the guide center lead to efficient and accurate origin retrieval. We evaluate the resulting Gaze Gesture Guiding system, G3, for interacting with distant objects in an office environment using a head-mounted display. Users report positively on their experience with both semaphoric gaze gestures and G3.
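
    To make the Exploration idea concrete, the following is a minimal sketch in Python (not the authors' implementation) of an RSVP presenter: command names are flashed one at a time at a fixed location, so the user's gaze can stay still instead of scanning a spatial layout. The command vocabulary, the 400 ms presentation interval, and the console stand-in for the head-mounted display overlay are all illustrative assumptions.

        # Minimal RSVP sketch: show one command name at a time at a fixed
        # location and rate. Vocabulary and timing are illustrative only.
        import itertools
        import time

        COMMANDS = ["Lights on", "Lights off", "Play music", "Next track"]  # hypothetical vocabulary
        RSVP_INTERVAL_S = 0.4  # assumed presentation rate; a real system would tune this empirically

        def rsvp(commands, cycles=2, interval=RSVP_INTERVAL_S):
            # Cycle through the vocabulary a fixed number of times,
            # overwriting the same console position each step.
            total = cycles * len(commands)
            for name in itertools.islice(itertools.cycle(commands), total):
                print("\r" + name.ljust(20), end="", flush=True)  # stand-in for the HMD overlay
                time.sleep(interval)
            print()

        if __name__ == "__main__":
            rsvp(COMMANDS)

    Because every name appears at the same screen position, the user can attend to a single point; this is the property the paper's second experiment credits for the faster command retrieval compared with visually searching a spatial menu.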



      Published In

      MobileHCI '17: Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services
      September 2017, 874 pages
      ISBN: 9781450350754
      DOI: 10.1145/3098279
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org

      Publisher

      Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. RSVP
      2. design
      3. eye input
      4. gaze
      5. gesture
      6. guidance
      7. guide

      Qualifiers

      • Research-article


      Conference

      MobileHCI '17

      Acceptance Rates

      MobileHCI '17 Paper Acceptance Rate: 45 of 224 submissions, 20%
      Overall Acceptance Rate: 202 of 906 submissions, 22%



      Cited By

      • (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies 183:C. DOI: 10.1016/j.ijhcs.2023.103196. Online publication date: 14-Mar-2024.
      • (2024) Design recommendations of target size and tracking speed under circular and square trajectories for smooth pursuit with Euclidean algorithm in eye-control system. Displays 81 (102608). DOI: 10.1016/j.displa.2023.102608. Online publication date: Jan-2024.
      • (2023) Boosted Gaze Gesture Recognition Using Underlying Head Orientation Sequence. IEEE Access 11 (43675-43689). DOI: 10.1109/ACCESS.2023.3270285. Online publication date: 2023.
      • (2022) Dwell Selection with ML-based Intent Prediction Using Only Gaze Data. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6:3 (1-21). DOI: 10.1145/3550301. Online publication date: 7-Sep-2022.
      • (2022) Evaluating the Performance of Machine Learning Algorithms in Gaze Gesture Recognition Systems. IEEE Access 10 (1020-1035). DOI: 10.1109/ACCESS.2021.3136153. Online publication date: 2022.
      • (2021) Exploring Social Acceptability and Users' Preferences of Head- and Eye-Based Interaction with Mobile Devices. Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia (12-23). DOI: 10.1145/3490632.3490636. Online publication date: 5-Dec-2021.
      • (2021) StickyPie: A Gaze-Based, Scale-Invariant Marking Menu Optimized for AR/VR. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (1-16). DOI: 10.1145/3411764.3445297. Online publication date: 6-May-2021.
      • (2021) Smooth Pursuit Study on an Eye-Control System for Continuous Variable Adjustment Tasks. International Journal of Human-Computer Interaction 39:1 (23-33). DOI: 10.1080/10447318.2021.2012979. Online publication date: 15-Dec-2021.
      • (2019) Guidance in Cinematic Virtual Reality: Taxonomy, Research Status and Challenges. Multimodal Technologies and Interaction 3:1 (19). DOI: 10.3390/mti3010019. Online publication date: 19-Mar-2019.
      • (2019) Tiger. Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services (1-11). DOI: 10.1145/3338286.3340117. Online publication date: 1-Oct-2019.
