
Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera

Published: 20 May 2021

Abstract

Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) currently rely on uncomfortable head tracking that controls a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing times, and the need for hardware add-ons such as anti-reflective lens coatings and infrared emitters. We present an innovative mobile VR eye tracking methodology that uses only the eye images captured by the front-facing (selfie) camera through the headset’s lens, without any hardware modifications. Our system first enhances the low-contrast, poorly lit eye images with a pipeline of customised low-level image enhancements that suppress obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that runs only once, speeding up iris tracking by reducing the search space on mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates. A thin bezel of light displayed at the top edge of the screen provides constant illumination, and a confidence metric estimates the probability of successful iris detection. Calibration and a linear gaze mapping between the estimated iris centroid and physical pixels on the screen yield low-latency, real-time iris tracking. A formal study confirmed that our system’s accuracy matches that of eye trackers in commercial VR headsets in the central part of the headset’s field-of-view. In a VR game, completion time with gaze-driven interaction was as fast as with head-tracked interaction, without the need for consecutive head motions. In a VR panorama viewer, users could successfully switch between panoramas using gaze.
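The enhancement and gaze-mapping stages described above can be sketched in NumPy. This is a minimal illustration of the general techniques, not the authors' implementation: global histogram equalization stands in for the paper's customised low-level enhancement pipeline, and a least-squares affine fit illustrates the linear calibration; all function names are hypothetical.

```python
import numpy as np

def equalize_hist(img):
    """Global histogram equalization for a uint8 grayscale eye image.

    A stand-in for the paper's customised enhancement step: stretches the
    cumulative intensity distribution to the full 0-255 range to lift a
    low-contrast, poorly lit image.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                  # first occupied intensity bin
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
    return np.clip(lut, 0, 255).astype(np.uint8)[img]

def fit_linear_gaze_map(iris_pts, screen_pts):
    """Least-squares affine map from iris centroids to screen pixels.

    iris_pts:   (N, 2) iris centroids recorded while the user fixates
                N known calibration targets (e.g. a 9-point grid).
    screen_pts: (N, 2) pixel coordinates of those targets.
    Returns a (3, 2) matrix A with [x, y, 1] @ A ~= [sx, sy].
    """
    X = np.hstack([iris_pts, np.ones((len(iris_pts), 1))])  # homogeneous coords
    A, *_ = np.linalg.lstsq(X, screen_pts, rcond=None)
    return A

def map_gaze(A, iris_xy):
    """Map one estimated iris centroid to an on-screen gaze point."""
    return np.array([iris_xy[0], iris_xy[1], 1.0]) @ A
```

With a multi-point calibration grid the affine fit is over-determined, which averages out per-target jitter in the detected iris centroids before real-time mapping begins.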



    Published In

    ACM Transactions on Applied Perception, Volume 18, Issue 3
    July 2021, 148 pages
    ISSN: 1544-3558
    EISSN: 1544-3965
    DOI: 10.1145/3467015

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 20 May 2021
    Accepted: 01 February 2021
    Revised: 01 February 2021
    Received: 01 July 2020
    Published in TAP Volume 18, Issue 3


    Author Tags

    1. Mobile VR
    2. eye tracking

    Qualifiers

    • Research-article
    • Research
    • Refereed


Cited By

    • (2024) “A review on personal calibration issues for video-oculographic-based gaze tracking.” Frontiers in Psychology 15. DOI: 10.3389/fpsyg.2024.1309047. Online publication date: 20-Mar-2024.
    • (2024) “Best low-cost methods for real-time detection of the eye and gaze tracking.” i-com 23:1, 79-94. DOI: 10.1515/icom-2023-0026. Online publication date: 8-Jan-2024.
    • (2024) “Improving Inclusion of Virtual Reality Through Enhancing Interactions in Low-Fidelity VR.” 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 305-310. DOI: 10.1109/VRW62533.2024.00061. Online publication date: 16-Mar-2024.
    • (2024) “Exploring loyalty drivers for smartphone and mobile carriers.” Humanities and Social Sciences Communications 11:1. DOI: 10.1057/s41599-024-03371-0. Online publication date: 29-Jun-2024.
    • (2024) “Comparative study of interaction methods for mobile gaming while running on a treadmill.” Computers and Graphics 117:C, 164-171. DOI: 10.1016/j.cag.2023.10.020. Online publication date: 4-Mar-2024.
    • (2024) “Eye-tracking on virtual reality: a survey.” Virtual Reality 28:1. DOI: 10.1007/s10055-023-00903-y. Online publication date: 5-Feb-2024.
    • (2023) “Enhancing Online Learning Monitoring with Novel Image Recognition Method Using Dlib for Eye Feature Detection.” 2023 12th International Conference on Awareness Science and Technology (iCAST), 340-345. DOI: 10.1109/iCAST57874.2023.10359300. Online publication date: 9-Nov-2023.
    • (2023) “Cross-Device Augmented Reality Systems for Fire and Rescue based on Thermal Imaging and Live Tracking.” 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 50-54. DOI: 10.1109/ISMAR-Adjunct60411.2023.00018. Online publication date: 16-Oct-2023.
    • (2023) “Predicting Future Eye Gaze Using Inertial Sensors.” IEEE Access 11, 67482-67497. DOI: 10.1109/ACCESS.2023.3292411.
    • (2023) “An Improved Cross-Ratio Based Gaze Estimation Method Using Weighted Average and Polynomial Compensation.” IEEE Access 11, 2410-2423. DOI: 10.1109/ACCESS.2023.3234120.
