
VRpursuits: interaction in virtual reality using smooth pursuit eye movements

Published: 29 May 2018

Abstract

Gaze-based interaction using smooth pursuit eye movements (Pursuits) is attractive given that it is intuitive and overcomes the Midas touch problem. At the same time, eye tracking is becoming increasingly popular for VR applications. While Pursuits was shown to be effective in several interaction contexts, it was never explored in depth for VR before. In a user study (N=26), we investigated how parameters that are specific to VR settings influence the performance of Pursuits. For example, we found that Pursuits is robust against different sizes of virtual 3D targets. However, performance improves when the trajectory size (e.g., radius) is larger, particularly if the user is walking while interacting. While walking, selecting moving targets via Pursuits is generally feasible, albeit less accurate than when stationary. Finally, we discuss the implications of these findings and the potential of smooth pursuits for interaction in VR by demonstrating two sample use cases: 1) gaze-based authentication in VR, and 2) a space meteors shooting game.
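Pursuits selects a target by correlating the user's gaze trajectory with the trajectory of each moving on-screen target over a sliding window, picking the target whose motion best matches. The sketch below illustrates that core idea; it is not the paper's implementation, and the correlation threshold and the per-axis matching rule are illustrative assumptions:

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if sa == 0 or sb == 0:
        return 0.0  # a constant signal carries no motion to correlate
    return cov / (sa * sb)

def select_target(gaze, trajectories, threshold=0.9):
    """Return the index of the target whose trajectory best correlates
    with the gaze samples, or None if no target clears the threshold.

    `gaze` and each entry of `trajectories` are equal-length lists of
    (x, y) positions sampled over the same time window.
    """
    gx = [p[0] for p in gaze]
    gy = [p[1] for p in gaze]
    best, best_corr = None, threshold
    for i, traj in enumerate(trajectories):
        tx = [p[0] for p in traj]
        ty = [p[1] for p in traj]
        # Require both axes to match: take the weaker of the two correlations.
        corr = min(pearson(gx, tx), pearson(gy, ty))
        if corr > best_corr:
            best, best_corr = i, corr
    return best

if __name__ == "__main__":
    # Two targets on circular trajectories, half a cycle out of phase.
    steps = [i * 2 * math.pi / 60 for i in range(60)]
    traj0 = [(math.cos(t), math.sin(t)) for t in steps]
    traj1 = [(math.cos(t + math.pi), math.sin(t + math.pi)) for t in steps]
    # Gaze follows target 0 with a constant offset (offsets do not
    # affect Pearson correlation, which is shift-invariant).
    gaze = [(x + 0.01, y - 0.01) for x, y in traj0]
    print(select_target(gaze, [traj0, traj1]))  # selects target 0
```

Because selection depends on correlating motion rather than on fixating a point, dwell-based false activations (the Midas touch problem) are avoided, and no per-user calibration of absolute gaze position is needed.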



    Published In

    AVI '18: Proceedings of the 2018 International Conference on Advanced Visual Interfaces
    May 2018
    430 pages
    ISBN:9781450356169
    DOI:10.1145/3206505

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. eye tracking
    2. gaze interaction
    3. pursuits
    4. virtual reality

    Qualifiers

    • Research-article

    Funding Sources

    • Bavarian State Ministry of Education, Science and the Arts
    • Cluster of Excellence on Multimodal Computing and Interaction (MMCI) at Saarland University, Germany

    Conference

    AVI '18
    AVI '18: 2018 International Conference on Advanced Visual Interfaces
    May 29 - June 1, 2018
    Grosseto, Castiglione della Pescaia, Italy

    Acceptance Rates

AVI '18 Paper Acceptance Rate: 19 of 77 submissions, 25%
Overall Acceptance Rate: 128 of 490 submissions, 26%

    Bibliometrics

    Article Metrics

    • Downloads (Last 12 months)194
    • Downloads (Last 6 weeks)15
    Reflects downloads up to 01 Sep 2024

    Cited By

    • (2024) Recent Trends of Authentication Methods in Extended Reality: A Survey. Applied System Innovation 7:3 (45). DOI: 10.3390/asi7030045. Online publication date: 28-May-2024.
    • (2024) Towards an Eye-Brain-Computer Interface: Combining Gaze with the Stimulus-Preceding Negativity for Target Selections in XR. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3641925. Online publication date: 11-May-2024.
    • (2024) Enhancing Fixation and Pursuit: Optimizing Field of View and Number of Targets for Selection Performance in Virtual Reality. International Journal of Human-Computer Interaction, 1-13. DOI: 10.1080/10447318.2024.2313888. Online publication date: 15-Feb-2024.
    • (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies 183:C. DOI: 10.1016/j.ijhcs.2023.103196. Online publication date: 14-Mar-2024.
    • (2024) Design recommendations of target size and tracking speed under circular and square trajectories for smooth pursuit with Euclidean algorithm in eye-control system. Displays 81 (102608). DOI: 10.1016/j.displa.2023.102608. Online publication date: Jan-2024.
    • (2024) SoundOrbit: motion-correlation interaction with auditory orbital trajectories. Personal and Ubiquitous Computing. DOI: 10.1007/s00779-024-01818-4. Online publication date: 15-Jun-2024.
    • (2024) Eye Tracking in Virtual Reality. Encyclopedia of Computer Graphics and Games, 681-688. DOI: 10.1007/978-3-031-23161-2_170. Online publication date: 5-Jan-2024.
    • (2023) Pactolo Bar: An Approach to Mitigate the Midas Touch Problem in Non-Conventional Interaction. Sensors 23:4 (2110). DOI: 10.3390/s23042110. Online publication date: 13-Feb-2023.
    • (2023) Comparison of a VR Stylus with a Controller, Hand Tracking, and a Mouse for Object Manipulation and Medical Marking Tasks in Virtual Reality. Applied Sciences 13:4 (2251). DOI: 10.3390/app13042251. Online publication date: 9-Feb-2023.
    • (2023) Analyzing the Effectiveness of Verbal-Visual Learning Style Rating (VVLSR) Using Eye-Tracking Technology. Journal of Flow Visualization and Image Processing 30:4 (47-65). DOI: 10.1615/JFlowVisImageProc.v30.i4.30. Online publication date: 2023.
