DOI: 10.1145/3379156.3391835
ETRA '20 Conference Proceedings · Short Paper

Evaluation of Gaze Depth Estimation from Eye Tracking in Augmented Reality

Published: 02 June 2020

Abstract

Gaze tracking in 3D has the potential to improve interaction with objects and visualizations in augmented reality. However, previous research showed that subjective perception of distance varies between real and virtual surroundings. We wanted to determine whether objectively measured 3D gaze depth through eye tracking also exhibits differences between entirely real and augmented environments. To this end, we conducted an experiment (N = 25) in which we used Microsoft HoloLens with a binocular eye tracking add-on from Pupil Labs. Participants performed a task that required them to look at stationary real and virtual objects while wearing a HoloLens device. We were not able to find significant differences in the gaze depth measured by eye tracking. Finally, we discuss our findings and their implications for gaze interaction in immersive analytics, and the quality of the collected gaze data.



Published In

ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
June 2020
305 pages
ISBN: 9781450371346
DOI: 10.1145/3379156

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. Augmented reality
  2. depth perception
  3. eye tracking
  4. immersive analytics
  5. user study
  6. visualization

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • Deutsche Forschungsgemeinschaft

Conference

ETRA '20

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%



Cited By

  • (2024) Augmented Reality Interaction Based on Cross-Device Interaction Expansion. 2024 21st International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP), 1–6. DOI: 10.1109/ICCWAMTIP64812.2024.10873768. Published: 14 Dec 2024.
  • (2023) See or Hear? Exploring the Effect of Visual/Audio Hints and Gaze-assisted Instant Post-task Feedback for Visual Search Tasks in AR. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 1113–1122. DOI: 10.1109/ISMAR59233.2023.00128. Published: 16 Oct 2023.
  • (2023) Evaluating the Feasibility of Predicting Information Relevance During Sensemaking with Eye Gaze Data. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 713–722. DOI: 10.1109/ISMAR59233.2023.00086. Published: 16 Oct 2023.
  • (2022) Gaze-Vergence-Controlled See-Through Vision in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 28, 11, 3843–3853. DOI: 10.1109/TVCG.2022.3203110. Published: Nov 2022.
  • (2021) Augmented Reality: Focusing on Photonics in Industry 4.0. IEEE Journal of Selected Topics in Quantum Electronics 27, 6, 1–11. DOI: 10.1109/JSTQE.2021.3093721. Published: Nov 2021.
  • (2021) Adaptive Hüllen und Strukturen [Adaptive Envelopes and Structures]. Bautechnik 98, 3, 208–221. DOI: 10.1002/bate.202000107. Published: 25 Feb 2021.
