DOI: 10.1145/1344471.1344488

Improving the accuracy of gaze input for interaction

Published: 26 March 2008

Abstract

Using gaze information as a form of input poses challenges based on the nature of eye movements and how we humans use our eyes in conjunction with other motor actions. In this paper, we present three techniques for improving the use of gaze as a form of input. We first present a saccade detection and smoothing algorithm that works on real-time streaming gaze information. We then present a study which explores some of the timing issues of using gaze in conjunction with a trigger (key press or other motor action) and propose a solution for resolving these issues. Finally, we present the concept of Focus Points, which makes it easier for users to focus their gaze when using gaze-based interaction techniques. Though these techniques were developed for improving the performance of gaze-based pointing, their use is applicable in general to using gaze as a practical form of input.
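The abstract names the techniques but this page carries no further detail, so the following is a minimal sketch, in Python, of how such a pipeline might look: a generic velocity-threshold rule for saccade detection (a standard fixation/saccade identification approach, not necessarily the authors' algorithm) feeding an incremental fixation mean, plus a look-back step for pairing gaze with a trigger such as a key press. Every name, threshold, and signature here (GazeSample, GazeSmoother, the 1000 px/s velocity cutoff, the 150 ms trigger lead) is an illustrative assumption rather than anything taken from the paper.

import math
from collections import deque
from dataclasses import dataclass


@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # screen x in pixels
    y: float  # screen y in pixels


class GazeSmoother:
    """Streaming fixation smoothing with velocity-threshold saccade detection."""

    def __init__(self, saccade_velocity_px_s=1000.0, history_s=1.0):
        # Pixels/second above which a movement counts as a saccade. The value
        # is a placeholder; real systems usually threshold in degrees of
        # visual angle per second, which depends on screen geometry.
        self.saccade_velocity_px_s = saccade_velocity_px_s
        self.history_s = history_s      # how far back a trigger may look
        self.prev = None                # previous raw sample
        self.n = 0                      # samples in the current fixation
        self.fx = self.fy = 0.0         # running mean of the current fixation
        self.trail = deque()            # (t, smoothed_x, smoothed_y) history

    def update(self, s: GazeSample):
        """Consume one raw sample; return the current smoothed gaze point."""
        if self.prev is not None:
            dt = max(s.t - self.prev.t, 1e-6)
            v = math.hypot(s.x - self.prev.x, s.y - self.prev.y) / dt
            if v > self.saccade_velocity_px_s:
                self.n = 0  # saccade detected: start a fresh fixation estimate
        self.prev = s

        # Incremental mean smooths jitter within a fixation without
        # smearing the estimate across saccade boundaries.
        self.n += 1
        self.fx += (s.x - self.fx) / self.n
        self.fy += (s.y - self.fy) / self.n

        self.trail.append((s.t, self.fx, self.fy))
        while self.trail and s.t - self.trail[0][0] > self.history_s:
            self.trail.popleft()
        return self.fx, self.fy

    def gaze_at_trigger(self, t_trigger, lead_s=0.15):
        """Smoothed gaze shortly *before* a trigger such as a key press.

        The eyes often jump to the next target before the hand finishes
        the press, so sampling gaze at lead_s seconds before the trigger
        (an assumed offset) tends to recover the intended target.
        """
        t_target = t_trigger - lead_s
        result = None
        for t, x, y in self.trail:  # trail is ordered by time
            if t > t_target:
                break
            result = (x, y)
        # Fall back to the oldest retained point if the buffer starts late.
        if result is None and self.trail:
            result = (self.trail[0][1], self.trail[0][2])
        return result

In use, update would be called once per sample from the tracker's streaming API, and gaze_at_trigger from the key-press handler; the look-back compensates for the eyes typically moving on before the hand completes the action, which is the class of timing issue the abstract describes.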




Information

    Published In

    ETRA '08: Proceedings of the 2008 symposium on Eye tracking research & applications
    March 2008
    285 pages
ISBN: 9781595939821
DOI: 10.1145/1344471


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. eye tracking
    2. eye-hand coordination
    3. fixation smoothing
    4. focus points
    5. gaze input
    6. gaze-enhanced user interface design

    Qualifiers

    • Poster

    Conference

ETRA '08: Eye Tracking Research and Applications
March 26-28, 2008
Savannah, Georgia, USA

    Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)


Cited By
• (2024) Comparison of Unencumbered Interaction Technique for Head-Mounted Displays. Proceedings of the ACM on Human-Computer Interaction 8(ISS), 500-516. DOI: 10.1145/3698146. Online publication date: 24-Oct-2024.
• (2024) The Impact of Gaze and Hand Gesture Complexity on Gaze-Pinch Interaction Performances. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 622-626. DOI: 10.1145/3675094.3678990. Online publication date: 5-Oct-2024.
• (2024) Shifting Focus with HCEye: Exploring the Dynamics of Visual Highlighting and Cognitive Load on User Attention and Saliency Prediction. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1-18. DOI: 10.1145/3655610. Online publication date: 28-May-2024.
• (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1-20. DOI: 10.1145/3655601. Online publication date: 28-May-2024.
• (2024) GazePrompt: Enhancing Low Vision People's Reading Experience with Gaze-Aware Augmentations. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642878. Online publication date: 11-May-2024.
• (2024) Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality. IEEE Transactions on Visualization and Computer Graphics 30(11), 7234-7244. DOI: 10.1109/TVCG.2024.3456153. Online publication date: 1-Nov-2024.
• (2024) Design Principles and Challenges for Gaze + Pinch Interaction in XR. IEEE Computer Graphics and Applications 44(3), 74-81. DOI: 10.1109/MCG.2024.3382961. Online publication date: May-2024.
• (2024) Evaluating Target Expansion for Eye Pointing Tasks. Interacting with Computers 36(4), 209-223. DOI: 10.1093/iwc/iwae004. Online publication date: 27-Feb-2024.
• (2023) Integrating Gaze and Mouse Via Joint Cross-Attention Fusion Net for Students' Activity Recognition in E-learning. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1-35. DOI: 10.1145/3610876. Online publication date: 27-Sep-2023.
• (2023) Exploring Gesture and Gaze Proxies to Communicate Instructor's Nonverbal Cues in Lecture Videos. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3544549.3585842. Online publication date: 19-Apr-2023.
