Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays

Published: 24 February 2014

Abstract

In this paper, we investigate the concept of gaze-based interaction with 3D user interfaces. We currently see stereoscopic displays becoming ubiquitous, particularly as auto-stereoscopy enables the perception of 3D content without the use of glasses. As a result, application areas for 3D beyond entertainment in cinema or at home emerge, including work settings, mobile phones, public displays, and cars. At the same time, eye tracking is hitting the consumer market with low-cost devices. We envision eye trackers in the future to be integrated with consumer devices (laptops, mobile phones, displays), hence allowing the user's gaze to be analyzed and used as input for interactive applications. A particular challenge when applying this concept to 3D displays is that current eye trackers provide the gaze point in 2D only (x and y coordinates). In this paper, we compare the performance of two methods that use the eye's physiology for calculating the gaze point in 3D space, hence enabling gaze-based interaction with stereoscopic content. Furthermore, we provide a comparison of gaze interaction in 2D and 3D with regard to user experience and performance. Our results show that with current technology, eye tracking on stereoscopic displays is possible with performance similar to that on standard 2D screens.
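Recovering a 3D gaze point from a 2D eye tracker is commonly based on binocular vergence: the tracker reports one on-screen gaze point per eye, and the horizontal disparity between the two points encodes depth. As a rough illustration of that general idea (not necessarily the authors' exact method; the interpupillary distance, viewing distance, and all names in this sketch are assumptions), the following Python sketch intersects the two gaze rays to recover a 3D gaze point:

```python
"""Vergence-based 3D gaze estimation from binocular 2D gaze points.

A minimal sketch of one common physiology-based formulation (gaze-ray
intersection via screen disparity), not the paper's implementation.
Coordinates are in metres, origin midway between the eyes, screen
plane at z = screen_distance.
"""

from dataclasses import dataclass


@dataclass
class GazePoint3D:
    x: float
    y: float
    z: float  # z == screen_distance means the gaze converges on the screen


def gaze_point_3d(left_x: float, left_y: float,
                  right_x: float, right_y: float,
                  ipd: float = 0.063,           # interpupillary distance (m), assumed
                  screen_distance: float = 0.6  # viewer-to-screen distance (m), assumed
                  ) -> GazePoint3D:
    """Intersect the two gaze rays to recover the 3D gaze point.

    With the eyes at (-ipd/2, 0, 0) and (+ipd/2, 0, 0), the rays through
    the per-eye on-screen points meet at depth
        z = ipd * screen_distance / (ipd - disparity),
    where disparity = right_x - left_x. Negative (crossed) disparity puts
    the gaze point in front of the screen, positive (uncrossed) behind it.
    """
    disparity = right_x - left_x
    if disparity >= ipd:
        # Rays are parallel or diverging: vergence carries no depth signal.
        raise ValueError("disparity must be smaller than the IPD")
    t = ipd / (ipd - disparity)          # parameter along the left gaze ray
    return GazePoint3D(
        x=-ipd / 2 + t * (left_x + ipd / 2),
        y=t * (left_y + right_y) / 2,    # average the two vertical readings
        z=t * screen_distance,
    )


if __name__ == "__main__":
    # Crossed disparity of 6.3 mm -> gaze point ~9% in front of the screen.
    p = gaze_point_3d(left_x=0.00315, left_y=0.0, right_x=-0.00315, right_y=0.0)
    print(f"gaze depth: {p.z:.3f} m (screen at 0.600 m)")  # ~0.545 m
```

Given such a depth estimate, selecting among the layers of a stereoscopic UI reduces to comparing the recovered z against each layer's disparity-defined depth.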




    Published In

    IUI '14: Proceedings of the 19th international conference on Intelligent User Interfaces
    February 2014
    386 pages
    ISBN:9781450321846
    DOI:10.1145/2557500


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. 3D
    2. eye tracking
    3. gaze interaction
    4. stereoscopic displays

    Qualifiers

    • Research-article

    Conference

    IUI '14

    Acceptance Rates

    IUI '14 paper acceptance rate: 46 of 191 submissions, 24%
    Overall acceptance rate: 746 of 2,811 submissions, 27%



    Cited By
    • (2024) Bi-Directional Gaze-Based Communication: A Review. Multimodal Technologies and Interaction 8(12), 108. DOI: 10.3390/mti8120108. Online publication date: 4-Dec-2024
    • (2023) Gaze Depth Estimation for In-vehicle AR Displays. Proceedings of the Augmented Humans International Conference 2023, 323-325. DOI: 10.1145/3582700.3583707. Online publication date: 12-Mar-2023
    • (2023) Affordance-Guided User Elicitation of Interaction Concepts for Unimodal Gaze Control of Potential Holographic 3D UIs in Automotive Applications. 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 14-19. DOI: 10.1109/ISMAR-Adjunct60411.2023.00011. Online publication date: 16-Oct-2023
    • (2022) Gaze-Vergence-Controlled See-Through Vision in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 28(11), 3843-3853. DOI: 10.1109/TVCG.2022.3203110. Online publication date: Nov-2022
    • (2022) Eye Positioning System for PC Based on Autostereoscopy with Android. 2022 7th International Conference on Multimedia Communication Technologies (ICMCT), 5-9. DOI: 10.1109/ICMCT57031.2022.00010. Online publication date: Jul-2022
    • (2020) Improved vergence and accommodation via Purkinje Image tracking with multiple cameras for AR glasses. 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 320-331. DOI: 10.1109/ISMAR50242.2020.00058. Online publication date: Nov-2020
    • (2019) Monocular gaze depth estimation using the vestibulo-ocular reflex. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-9. DOI: 10.1145/3314111.3319822. Online publication date: 25-Jun-2019
    • (2019) Resolving Target Ambiguity in 3D Gaze Interaction through VOR Depth Estimation. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3290605.3300842. Online publication date: 2-May-2019
    • (2019) THE-3DI: Tracing head and eyes for 3D interactions. Multimedia Tools and Applications. DOI: 10.1007/s11042-019-08305-6. Online publication date: 30-Oct-2019
    • (2018) A Bermuda Triangle? Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3173574.3174035. Online publication date: 21-Apr-2018
