Looking at or through? Using eye tracking to infer attention location for wearable transparent displays

Published: 13 September 2014
Abstract

    Wearable near-eye displays pose interesting challenges for interface design. These devices present the user with a duality of visual worlds, with a virtual window of information overlaid onto the physical world. Because of this duality, we suggest that the wearable interface would benefit from understanding where the user's visual attention is directed. We explore the potential of eye tracking to address this problem, and describe four eye tracking techniques designed to provide data about where the user's attention is directed. We also propose some attention-aware user interface techniques demonstrating the potential of the eyes for wearable displays user interface management.
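One way to make the "looking at or through" distinction concrete is binocular vergence: when the eyes fixate a near plane (such as a display a short distance in front of the face) the gaze directions of the two eyes converge more strongly than when fixating the distant physical world. The sketch below is a minimal illustration of that idea, not necessarily one of the paper's four techniques; the function names, the 3° threshold, and the example gaze vectors are all assumptions for illustration.

```python
import math

def vergence_angle_deg(left_dir, right_dir):
    """Angle in degrees between the two eyes' gaze direction vectors."""
    dot = sum(a * b for a, b in zip(left_dir, right_dir))
    nl = math.sqrt(sum(a * a for a in left_dir))
    nr = math.sqrt(sum(a * a for a in right_dir))
    cos_ang = max(-1.0, min(1.0, dot / (nl * nr)))
    return math.degrees(math.acos(cos_ang))

def classify_attention(left_dir, right_dir, threshold_deg=3.0):
    """Classify attention as on the near display plane vs. the far scene.

    threshold_deg is a hypothetical tuning parameter; a real system would
    calibrate it per user and per display focal distance.
    """
    if vergence_angle_deg(left_dir, right_dir) > threshold_deg:
        return "display"
    return "scene"

# Eyes ~64 mm apart fixating a point 0.3 m ahead (near, e.g. a display plane):
near = classify_attention((0.032, 0.0, 0.3), (-0.032, 0.0, 0.3))
# The same eyes fixating a point 10 m ahead (far, i.e. the physical world):
far = classify_attention((0.032, 0.0, 10.0), (-0.032, 0.0, 10.0))
```

In this toy geometry the near fixation produces a vergence angle of roughly 12°, well above the threshold, while the far fixation verges by well under 1°, so the two cases separate cleanly.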

    Supplementary Material

    MOV File (p87-vidal.mov)





      Published In

      ISWC '14: Proceedings of the 2014 ACM International Symposium on Wearable Computers
      September 2014
      154 pages
      ISBN:9781450329699
      DOI:10.1145/2634317


Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. attention detection
      2. eye movements
      3. eye tracking
      4. head-mounted displays (HMD)
      5. user interface management

      Qualifiers

      • Research-article

      Conference

      UbiComp '14
      UbiComp '14: The 2014 ACM Conference on Ubiquitous Computing
      September 13 - 17, 2014
Seattle, Washington

      Acceptance Rates

      Overall Acceptance Rate 38 of 196 submissions, 19%


Article Metrics

• Downloads (last 12 months): 19
• Downloads (last 6 weeks): 1
Reflects downloads up to 12 Aug 2024.

Cited By
• (2024) FocusFlow: 3D Gaze-Depth Interaction in Virtual Reality Leveraging Active Visual Depth Manipulation. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642589. Online publication date: 11 May 2024.
• (2022) Gaze-Vergence-Controlled See-Through Vision in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 28(11), 3843-3853. DOI: 10.1109/TVCG.2022.3203110. Online publication date: Nov 2022.
• (2019) A Design Space for Gaze Interaction on Head-mounted Displays. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3290605.3300855. Online publication date: 2 May 2019.
• (2018) Towards a Symbiotic Human-Machine Depth Sensor. Adjunct Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, 114-116. DOI: 10.1145/3266037.3266119. Online publication date: 11 Oct 2018.
• (2018) Autopager. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 1-5. DOI: 10.1145/3204493.3204556. Online publication date: 14 Jun 2018.
• (2017) Symbiotic attention management in the context of internet of things. Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2017 ACM International Symposium on Wearable Computers, 941-946. DOI: 10.1145/3123024.3124559. Online publication date: 11 Sep 2017.
• (2017) Mindfulness and Asynchronous Neurofeedback: Coping with Mind Wandering. Universal Access in Human-Computer Interaction: Human and Technological Environments, 549-561. DOI: 10.1007/978-3-319-58700-4_45. Online publication date: 17 May 2017.
• (2016) Reducing in-vehicle interaction complexity. Proceedings of the 15th International Conference on Mobile and Ubiquitous Multimedia, 311-313. DOI: 10.1145/3012709.3016064. Online publication date: 12 Dec 2016.
• (2016) The Emergence of EyePlay. Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play, 171-185. DOI: 10.1145/2967934.2968084. Online publication date: 16 Oct 2016.
• (2016) On the Verge. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, 1519-1525. DOI: 10.1145/2851581.2892307. Online publication date: 7 May 2016.
