DOI: 10.1145/2638728.2641692
Research article

Gaze and mouse coordination in everyday work

Published: 13 September 2014

Abstract

Gaze tracking technology is increasingly common in desktop, laptop and mobile scenarios. Most previous research on eye gaze patterns during human-computer interaction has been confined to controlled laboratory studies. In this paper we present an in situ study of gaze and mouse coordination as participants went about their normal activities. We analyze the coordination between gaze and mouse, showing that gaze often leads the mouse, but not as much as previously reported, and in ways that depend on the type of target. Characterizing the relationship between the eyes and mouse in realistic multi-task settings highlights some new challenges we face in designing robust gaze-enhanced interaction techniques.
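The abstract's central measurement, how far gaze leads the mouse, can in principle be estimated from synchronized gaze and cursor logs. The sketch below is purely illustrative and is not the paper's method (the authors' analysis is target-dependent and more involved); the function name, sampling rate, and synthetic data are hypothetical. It takes the lead time to be the shift that maximizes the cross-correlation of the two signals:

```python
import numpy as np

def estimate_lead_ms(gaze, mouse, rate_hz):
    """Estimate how far gaze leads the mouse, in milliseconds, as the
    time shift that maximizes the cross-correlation of the two
    standardized signals. Positive values mean gaze leads."""
    g = (gaze - gaze.mean()) / gaze.std()
    m = (mouse - mouse.mean()) / mouse.std()
    corr = np.correlate(m, g, mode="full")     # slide gaze against mouse
    lag = int(np.argmax(corr)) - (len(g) - 1)  # lag in samples; > 0 means gaze leads
    return 1000.0 * lag / rate_hz

# Synthetic check: a cursor trace that trails the gaze trace by
# 5 samples at 50 Hz, i.e. gaze leads by exactly 100 ms.
signal = np.random.default_rng(0).standard_normal(1000)
gaze, mouse = signal[5:], signal[:-5]
print(estimate_lead_ms(gaze, mouse, rate_hz=50))  # prints 100.0
```

Real gaze and cursor traces are smooth and strongly autocorrelated, so a raw full-mode cross-correlation is biased toward small lags; in practice one would normalize by overlap length or restrict the lag search window.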




    Published In

    UbiComp '14 Adjunct: Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication
    September 2014, 1409 pages
    ISBN: 9781450330473
    DOI: 10.1145/2638728

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. gaze tracking
    2. mouse
    3. multimodal input
    4. target acquisition


    Conference

    UbiComp '14: The 2014 ACM Conference on Ubiquitous Computing
    September 13-17, 2014
    Seattle, Washington

    Acceptance Rates

    Overall Acceptance Rate 764 of 2,912 submissions, 26%


    Article Metrics

    • Downloads (last 12 months): 26
    • Downloads (last 6 weeks): 0

    Reflects downloads up to 30 Aug 2024.
    Cited By

    • (2023) Mouse Tracking as a Method for Examining the Perception and Cognition of Digital Maps. Digital 3(2), 127-136. doi:10.3390/digital3020009. 30 May 2023
    • (2023) Integrating Gaze and Mouse Via Joint Cross-Attention Fusion Net for Students' Activity Recognition in E-learning. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1-35. doi:10.1145/3610876. 27 Sep 2023
    • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56(2), 1-38. doi:10.1145/3606947. 15 Sep 2023
    • (2023) Dynamics of eye-hand coordination are flexibly preserved in eye-cursor coordination during an online, digital, object interaction task. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. doi:10.1145/3544548.3580866. 19 Apr 2023
    • (2023) Is your mouse attracted by your eyes. Engineering Applications of Artificial Intelligence 123(PC). doi:10.1016/j.engappai.2023.106495. 1 Aug 2023
    • (2021) GAZEL: Runtime Gaze Tracking for Smartphones. 2021 IEEE International Conference on Pervasive Computing and Communications (PerCom), 1-10. doi:10.1109/PERCOM50583.2021.9439113. 22 Mar 2021
    • (2020) Voice as a Mouse Click: Usability and Effectiveness of Simplified Hands-Free Gaze-Voice Selection. Applied Sciences 10(24), 8791. doi:10.3390/app10248791. 9 Dec 2020
    • (2020) Identifying users based on their eye tracker calibration data. ACM Symposium on Eye Tracking Research and Applications, 1-2. doi:10.1145/3379157.3391419. 2 Jun 2020
    • (2020) Protecting from Lunchtime Attack Using an Uncalibrated Eye Tracker Signal. ACM Symposium on Eye Tracking Research and Applications, 1-5. doi:10.1145/3379156.3391348. 2 Jun 2020
    • (2020) Predicting Human Errors from Gaze and Cursor Movements. 2020 International Joint Conference on Neural Networks (IJCNN), 1-8. doi:10.1109/IJCNN48605.2020.9207189. Jul 2020
