DOI: 10.1145/3491102.3517725

Kuiper Belt: Utilizing the “Out-of-natural Angle” Region in the Eye-gaze Interaction for Virtual Reality

Published: 29 April 2022

Abstract

The maximum physical range of horizontal human eye movement is approximately 45°. In a natural gaze shift, however, the direction of the gaze relative to the frontal direction of the head rarely exceeds 25°. We name this 25°–45° region the “Kuiper Belt” of eye-gaze interaction, and we utilize it to mitigate the Midas touch problem: enabling a visual search task while reducing false input in a virtual reality environment. In this work, we conduct two studies to establish design principles for placing menu items in the Kuiper Belt, an “out-of-natural-angle” region of eye-gaze movement, and to determine the effectiveness and workload of the Kuiper Belt-based method. The results indicate that the Kuiper Belt-based method facilitated the visual search task while reducing false input. Finally, we present example applications that utilize the findings of these studies.
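To make the geometry concrete, below is a minimal Python/NumPy sketch (our illustration, not the authors' implementation) of how a VR system might classify a gaze sample by its eye-in-head eccentricity, i.e., the angle between the gaze ray and the head's frontal axis, using the 25° and 45° boundaries from the abstract. The function and threshold names are assumptions for illustration.

```python
import numpy as np

# Boundaries reported in the abstract (names are ours, not the paper's).
NATURAL_LIMIT_DEG = 25.0   # natural gaze shifts rarely exceed this eccentricity
PHYSICAL_LIMIT_DEG = 45.0  # approximate maximum horizontal eye rotation

def eccentricity_deg(gaze_dir, head_forward):
    """Angle in degrees between the gaze ray and the head's frontal axis."""
    g = np.asarray(gaze_dir, dtype=float)
    h = np.asarray(head_forward, dtype=float)
    cos_theta = np.dot(g, h) / (np.linalg.norm(g) * np.linalg.norm(h))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def gaze_region(gaze_dir, head_forward):
    theta = eccentricity_deg(gaze_dir, head_forward)
    if theta < NATURAL_LIMIT_DEG:
        return "natural"      # ordinary viewing; input here risks Midas touch
    if theta <= PHYSICAL_LIMIT_DEG:
        return "kuiper_belt"  # deliberate eccentric gaze; menu items placed here
    return "out_of_range"     # beyond the physical limit; likely tracking noise

# Example: a gaze ray rotated ~30° off the head's forward axis lands in the belt.
head = [0.0, 0.0, 1.0]
gaze = [np.sin(np.radians(30)), 0.0, np.cos(np.radians(30))]
assert gaze_region(gaze, head) == "kuiper_belt"
```

Under this framing, a menu item placed in the 25°–45° band is reachable by a deliberate eye-only movement but is unlikely to be fixated during natural viewing, which is what lets the technique reduce false activations.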

Supplementary Material

• Supplemental Materials: 3491102.3517725-supplemental-materials.zip
• Talk Video: 3491102.3517725-talk-video.mp4 (MP4)
• Video Preview: 3491102.3517725-video-preview.mp4 (MP4)




      Published In

      CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
      April 2022
10,459 pages
ISBN: 978-1-4503-9157-3
DOI: 10.1145/3491102


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. Eye Tracking
      2. Eye-gaze Interface
      3. Menu Item Selection
      4. Virtual Reality

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

CHI '22: CHI Conference on Human Factors in Computing Systems
April 29 - May 5, 2022
New Orleans, LA, USA

      Acceptance Rates

Overall acceptance rate: 6,199 of 26,314 submissions (24%)


Cited By

• (2024) RPG: Rotation Technique in VR Locomotion using Peripheral Gaze. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1–19. https://doi.org/10.1145/3655609
• (2024) Body Language for VUIs: Exploring Gestures to Enhance Interactions with Voice User Interfaces. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 133–150. https://doi.org/10.1145/3643834.3660691
• (2024) Snap, Pursuit and Gain: Virtual Reality Viewport Control by Gaze. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3613904.3642838
• (2024) EmojiChat: Toward Designing Emoji-Driven Social Interaction in VR Museums. International Journal of Human–Computer Interaction, 1–17. https://doi.org/10.1080/10447318.2024.2387902
• (2023) Exploring Dwell-time from Human Cognitive Processes for Dwell Selection. Proceedings of the ACM on Human-Computer Interaction 7(ETRA), 1–15. https://doi.org/10.1145/3591128
• (2023) Leap to the Eye: Implicit Gaze-based Interaction to Reveal Invisible Objects for Virtual Environment Exploration. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 214–222. https://doi.org/10.1109/ISMAR59233.2023.00036
• (2023) DVGaze: Dual-View Gaze Estimation. 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 20575–20584. https://doi.org/10.1109/ICCV51070.2023.01886
• (2022) Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1–12. https://doi.org/10.1145/3562939.3565619
