
Gaze vs. Mouse: A Fast and Accurate Gaze-Only Click Alternative

Published: 05 November 2015
DOI: 10.1145/2807442.2807461

Abstract

Eye gaze tracking is a promising input method that is gradually finding its way into the mainstream. An obvious question is whether it can be used for point-and-click tasks as an alternative to mouse or touch. Pointing with gaze is fast and natural, but its accuracy is limited: gaze tracking still faces technical challenges as well as inherent physiological limitations. Furthermore, providing an alternative to clicking is challenging.
We consider use cases where purely gaze-based input is desired and the click targets are discrete user interface (UI) elements too small to be reliably resolved by gaze alone, e.g., links in hypertext. We present Actigaze, a new gaze-only click alternative that is fast and accurate for this scenario. A clickable UI element is selected by dwelling on one of a set of confirm buttons, based on two main design contributions: first, the confirm buttons stay at fixed positions with easily distinguishable visual identifiers such as colors, enabling procedural learning of the confirm button positions; second, UI elements are associated with confirm buttons through the visual identifiers in a way that minimizes the likelihood of inadvertent clicks. We evaluate two variants of the proposed click alternative, comparing them against the mouse and another gaze-only click alternative.
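
To make the selection mechanism concrete, the following is a minimal sketch of the dwell-on-confirm-button idea in TypeScript. It is not the paper's implementation: the names (Target, ConfirmButton, DwellSelector, assignColors), the 500 ms dwell threshold, and the four-color palette are assumptions made for illustration. The sketch encodes the two design contributions: confirm buttons at fixed positions identified by color, and a color assignment that keeps same-colored targets apart so that a slightly mis-resolved fixation maps to a different confirm button rather than triggering the wrong click.

```typescript
// Illustrative sketch of dwell-based confirm-button selection (assumed names
// and parameters; not the authors' code).

type Point = { x: number; y: number };

interface Target {            // a clickable page element, e.g. a hyperlink
  id: string;
  bounds: DOMRect;
  color?: string;             // visual identifier linking it to a confirm button
}

interface ConfirmButton {     // dwell-activated button at a fixed screen position
  color: string;
  bounds: DOMRect;
}

const PALETTE = ["red", "green", "blue", "orange"]; // assumed identifier set
const DWELL_MS = 500;                               // assumed dwell threshold

// Assign colors so that neighboring targets never share one: if gaze resolves
// to the wrong nearby target, that target maps to a different confirm button,
// so dwelling on the intended button cannot click the unintended element.
// Cycling the palette in layout order is a crude approximation of this goal.
function assignColors(targets: Target[]): void {
  const ordered = [...targets].sort(
    (a, b) => a.bounds.top - b.bounds.top || a.bounds.left - b.bounds.left
  );
  ordered.forEach((t, i) => { t.color = PALETTE[i % PALETTE.length]; });
}

// Fires the confirm callback once per completed dwell on a confirm button.
class DwellSelector {
  private current: ConfirmButton | null = null;
  private dwellStart = 0;
  private fired = false;

  constructor(
    private buttons: ConfirmButton[],
    private onConfirm: (color: string) => void // app resolves color -> target
  ) {}

  onGazeSample(p: Point, timestamp: number): void {
    const hit = this.buttons.find(b =>
      p.x >= b.bounds.left && p.x <= b.bounds.right &&
      p.y >= b.bounds.top && p.y <= b.bounds.bottom
    ) ?? null;

    if (hit !== this.current) {   // gaze entered a different button (or left)
      this.current = hit;
      this.dwellStart = timestamp;
      this.fired = false;
    } else if (hit && !this.fired && timestamp - this.dwellStart >= DWELL_MS) {
      this.fired = true;          // one dwell yields exactly one click
      this.onConfirm(hit.color);
    }
  }
}
```

In this reading of the technique, the colors on the confirm buttons correspond to the candidate targets near the user's last fixation in the content area, so confirming a click amounts to looking at the button whose color matches the intended link. Keeping the buttons at fixed positions is what lets this look-at-the-color response become procedural with practice, and restricting dwell activation to the dedicated buttons avoids the Midas-touch problem of accidental clicks while reading.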

Supplementary Material

Supplemental video: suppl.mov (uist2152-file4.mp4)
MP4 File (p385.mp4)




    Published In

    UIST '15: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
    November 2015, 686 pages
    ISBN: 9781450337793
    DOI: 10.1145/2807442


    Publisher

    Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. eye gaze tracking
    2. web browser navigation

    Qualifiers

    • Research-article

    Conference

    UIST '15

    Acceptance Rates

    UIST '15 Paper Acceptance Rate: 70 of 297 submissions, 24%
    Overall Acceptance Rate: 842 of 3,967 submissions, 21%



