Research Article
DOI: 10.1145/3385956.3418962

Empirical Evaluation of Gaze-enhanced Menus in Virtual Reality

Published: 01 November 2020

Abstract

Many user interfaces involve attention shifts between primary and secondary tasks, e.g., when changing a mode in a menu, which draws the user away from their main task. In this work, we investigate how eye-gaze input can exploit these attention shifts to enhance interaction with handheld menus. We assess three techniques for menu selection: dwell time, gaze button, and cursor. Each represents a different multimodal balance between gaze and manual input. We present a user study that compares the techniques against two manual baselines (dunk brush, pointer) in a compound colour-selection and line-drawing task. We show that user performance with the gaze techniques is comparable to pointer-based menu selection, with less physical effort. Furthermore, we analyse the trade-offs, as each technique strikes a different balance between temporal, manual, and visual interaction properties. Our research points to new opportunities for integrating multimodal gaze into menus and bimanual interfaces in 3D environments.
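Of the three techniques, dwell time selects a menu item once the gaze has rested on it for a fixed duration. The Python sketch below is illustrative only and not the authors' implementation: it assumes a per-frame hit test that reports which menu item the gaze ray currently intersects, and the 0.8 s threshold and the DwellSelector name are placeholders chosen for the example.

```python
# Minimal sketch of dwell-time menu selection (illustrative; not the paper's code).
# A per-frame caller passes the currently gazed menu item (or None) and the frame time.

from dataclasses import dataclass
from typing import Optional


@dataclass
class DwellSelector:
    dwell_threshold: float = 0.8      # seconds of continuous gaze required (assumed value)
    _current_item: Optional[str] = None
    _elapsed: float = 0.0

    def update(self, gazed_item: Optional[str], dt: float) -> Optional[str]:
        """Call once per frame. Returns the item id when the dwell threshold is met."""
        if gazed_item != self._current_item:
            # Gaze moved to a different item (or off the menu): restart the timer.
            self._current_item = gazed_item
            self._elapsed = 0.0
            return None
        if gazed_item is None:
            return None
        self._elapsed += dt
        if self._elapsed >= self.dwell_threshold:
            self._elapsed = 0.0          # re-arm after a selection
            return gazed_item
        return None


if __name__ == "__main__":
    # Simulate one second of steady gaze on a "red" colour swatch at 90 Hz.
    selector = DwellSelector()
    for frame in range(90):
        selected = selector.update("red", 1.0 / 90.0)
        if selected:
            print(f"selected {selected} after frame {frame}")
            break
```

Resetting the timer whenever the gaze leaves the item is the usual guard against unintended dwell selections (the Midas-touch problem); the gaze-button and cursor techniques avoid the timer altogether by confirming the gazed item with a manual input instead.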




Published In

VRST '20: Proceedings of the 26th ACM Symposium on Virtual Reality Software and Technology
November 2020
429 pages
ISBN:9781450376198
DOI:10.1145/3385956
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Design
  2. Gaze
  3. Manual input
  4. Menu
  5. Pointing
  6. Virtual Reality

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

VRST '20

Acceptance Rates

Overall Acceptance Rate 66 of 254 submissions, 26%

