DOI: 10.1145/3379155.3391312
Research Article

BimodalGaze: Seamlessly Refined Pointing with Gaze and Filtered Gestural Head Movement

Published: 02 June 2020

Abstract

Eye gaze is a fast and ergonomic modality for pointing but limited in precision and accuracy. In this work, we introduce BimodalGaze, a novel technique for seamless head-based refinement of a gaze cursor. The technique leverages eye-head coordination insights to separate natural from gestural head movement. This allows users to quickly shift their gaze to targets over larger fields of view with naturally combined eye-head movement, and to refine the cursor position with gestural head movement. In contrast to an existing baseline, head refinement is invoked automatically, and only if a target is not already acquired by the initial gaze shift. Study results show that users reliably achieve fine-grained target selection, but we observed a higher rate of initial selection errors affecting overall performance. An in-depth analysis of user performance provides insight into the classification of natural versus gestural head movement, for improvement of BimodalGaze and other potential applications.
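
The abstract does not spell out the classifier, but the coordination insight it builds on can be illustrated. During a natural gaze shift, the eyes and head rotate together toward the target; during a refinement gesture, the vestibulo-ocular reflex counter-rotates the eyes so that gaze stays roughly fixed in the world while the head moves. The following Python sketch is a minimal illustration of that distinction, assuming per-frame angular velocities of the head and of world-space gaze are available; every threshold, name, and gain value here is a hypothetical placeholder, not a parameter from the paper.

# Illustrative sketch of a natural-vs-gestural head movement classifier.
# All thresholds and the cursor gain are assumed values for illustration;
# the paper's actual filter and parameters are given in the full text.

from dataclasses import dataclass

HEAD_VEL_MIN = 5.0   # deg/s: head must be moving to count as a gesture (assumed)
GAZE_VEL_MAX = 10.0  # deg/s: world-space gaze must be nearly still (assumed)

@dataclass
class Frame:
    head_velocity: float  # angular speed of the head, deg/s
    gaze_velocity: float  # angular speed of gaze in world space, deg/s

def classify(frame: Frame) -> str:
    """Label a frame 'gestural' when the head moves while the
    vestibulo-ocular reflex holds gaze still in the world, and
    'natural' when eyes and head travel together in a gaze shift."""
    if frame.head_velocity > HEAD_VEL_MIN and frame.gaze_velocity < GAZE_VEL_MAX:
        return "gestural"
    return "natural"

def update_cursor(cursor, gaze_point, head_delta, mode):
    """Gaze mode snaps the cursor to the gaze point; gestural mode
    nudges it by scaled head rotation for fine refinement."""
    GAIN = 0.5  # hypothetical head-to-cursor gain
    if mode == "gestural":
        return (cursor[0] + GAIN * head_delta[0],
                cursor[1] + GAIN * head_delta[1])
    return gaze_point

# Example: fast head motion with stable world gaze is treated as a
# refinement gesture rather than a new gaze shift.
if __name__ == "__main__":
    frame = Frame(head_velocity=12.0, gaze_velocity=3.0)
    mode = classify(frame)                      # -> 'gestural'
    cursor = update_cursor((100.0, 100.0), (180.0, 90.0), (4.0, -2.0), mode)
    print(mode, cursor)                         # gestural (102.0, 99.0)

In a sketch of this kind, the mode switch is what makes refinement "seamless": no explicit trigger is needed, because gestural head movement is recognized from the motion signals themselves, consistent with the abstract's claim that head refinement is invoked automatically.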

Supplementary Material

a8-sidenmark-supplement (videofigure.mp4)
Video Figure




Published In

ETRA '20 Full Papers: ACM Symposium on Eye Tracking Research and Applications
June 2020
214 pages
ISBN: 9781450371339
DOI: 10.1145/3379155
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 02 June 2020


Author Tags

  1. Eye tracking
  2. Eye-head coordination
  3. Gaze interaction
  4. Refinement
  5. Virtual reality

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ETRA '20

Acceptance Rates

Overall Acceptance Rate: 69 of 137 submissions (50%)




Article Metrics

  • Downloads (last 12 months): 106
  • Downloads (last 6 weeks): 9
Reflects downloads up to 28 Jan 2025


Cited By

  • (2024) Magilock: a reliable control triggering method in multi-channel eye-control systems. Frontiers in Human Neuroscience 18. https://doi.org/10.3389/fnhum.2024.1365838 (22 Mar 2024)
  • (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8, ETRA, 1–20. https://doi.org/10.1145/3655601 (28 May 2024)
  • (2024) Unveiling Variations: A Comparative Study of VR Headsets Regarding Eye Tracking Volume, Gaze Accuracy, and Precision. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 650–655. https://doi.org/10.1109/VRW62533.2024.00127 (16 Mar 2024)
  • (2024) EyeShadows: Peripheral Virtual Copies for Rapid Gaze Selection and Interaction. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 681–689. https://doi.org/10.1109/VR58804.2024.00088 (16 Mar 2024)
  • (2024) Continuous Prediction of Pointing Targets With Motion and Eye-Tracking in Virtual Reality. IEEE Access 12, 5933–5946. https://doi.org/10.1109/ACCESS.2024.3350788 (2024)
  • (2024) MazeMind: Exploring the Effects of Hand Gestures and Eye Gazing on Cognitive Load and Task Efficiency in an Augmented Reality Environment. Design Computing and Cognition '24, 105–120. https://doi.org/10.1007/978-3-031-71922-6_7 (28 Sep 2024)
  • (2023) Vision-Based Interfaces for Character-Based Text Entry. Advances in Human-Computer Interaction 2023. https://doi.org/10.1155/2023/8855764 (1 Jan 2023)
  • (2023) GE-Simulator: An Open-Source Tool for Simulating Real-Time Errors for HMD-based Eye Trackers. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–6. https://doi.org/10.1145/3588015.3588417 (30 May 2023)
  • (2023) Gaze-based Mode-Switching to Enhance Interaction with Menus on Tablets. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–8. https://doi.org/10.1145/3588015.3588409 (30 May 2023)
  • (2023) Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3544548.3581201 (19 Apr 2023)
