DOI: 10.1145/3025453.3025920
Research article, CHI Conference Proceedings

Supporting Making Fixations and the Effect on Gaze Gesture Performance

Published: 02 May 2017
Abstract

Gaze gestures are deliberate patterns of eye movements that can be used to invoke commands. They are less reliant on accurate measurement and calibration than other gaze-based interaction techniques, and they can be used with wearable displays fitted with eye-tracking capability or as part of an assistive technology. The visual stimuli on a display that can act as fixation targets may be sparse or plentiful and will vary over time. This paper describes an experiment investigating how the amount of information provided on a display to assist in making fixations affects gaze gesture performance. The impact of providing visualization guides and small fixation targets on gesture completion time and error rates is presented. The number and durations of fixations made during gesture completion are used to explain differences in performance arising from practice and from the direction of eye movement.

    Supplementary Material

Supplemental video: suppl.mov (pn3668p.mp4)





      Published In

      CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
      May 2017
      7138 pages
      ISBN:9781450346559
      DOI:10.1145/3025453


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 02 May 2017


      Author Tags

      1. fixation duration
      2. fixation targets
      3. gaze gesture performance
      4. gaze gestures

      Qualifiers

      • Research-article

      Funding Sources

      • Suomen Akatemia

      Conference

      CHI '17

      Acceptance Rates

CHI '17 paper acceptance rate: 600 of 2,400 submissions (25%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)


      Cited By

• (2024) Uncovering and Addressing Blink-Related Challenges in Using Eye Tracking for Interactive Systems. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-23. DOI: 10.1145/3613904.3642086. Online publication date: 11-May-2024.
• (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies, 183:C. DOI: 10.1016/j.ijhcs.2023.103196. Online publication date: 14-Mar-2024.
• (2023) Study on the brightness and graphical display object directions of the Single-Gaze-Gesture user interface. Displays, 80 (102537). DOI: 10.1016/j.displa.2023.102537. Online publication date: Dec-2023.
• (2022) Enhancing User Experience of Eye-Controlled Systems: Design Recommendations on the Optimal Size, Distance and Shape of Interactive Components from the Perspective of Peripheral Vision. International Journal of Environmental Research and Public Health, 19:17 (10737). DOI: 10.3390/ijerph191710737. Online publication date: 29-Aug-2022.
• (2022) Head and Eye Egocentric Gesture Recognition for Human-Robot Interaction Using Eyewear Cameras. IEEE Robotics and Automation Letters, 7:3 (7067-7074). DOI: 10.1109/LRA.2022.3180442. Online publication date: Jul-2022.
• (2022) Gaze-based interaction. Computers and Graphics, 73:C (59-69). DOI: 10.1016/j.cag.2018.04.002. Online publication date: 21-Apr-2022.
• (2020) Eye-based interaction in graphical systems. ACM SIGGRAPH 2020 Courses, 1-246. DOI: 10.1145/3388769.3407492. Online publication date: 17-Aug-2020.
• (2019) Inducing gaze gestures by static illustrations. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-5. DOI: 10.1145/3317956.3318151. Online publication date: 25-Jun-2019.
• (2018) Path Word. Proceedings of the 20th ACM International Conference on Multimodal Interaction, 268-277. DOI: 10.1145/3242969.3243008. Online publication date: 2-Oct-2018.
