
Single gaze gestures

Published: 22 March 2010

Abstract

This paper examines gaze gestures and their applicability as a generic selection method for gaze-only controlled interfaces. The method explored here is the Single Gaze Gesture (SGG), i.e., a gesture consisting of a single point-to-point eye movement. Horizontal and vertical, long and short SGGs were evaluated on two eye tracking devices (Tobii and QuickGlance (QG)). The main findings show significant differences in selection time between long and short SGGs, between vertical and horizontal selections, and between the two tracking systems.
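The SGG described in the abstract is a single point-to-point eye movement classified by axis (horizontal or vertical) and length (short or long). A minimal sketch of such a classifier is shown below; the function name, coordinate convention, and pixel threshold are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of classifying a Single Gaze Gesture (SGG) from the
# start and end fixation points of one eye movement. The 400 px threshold
# separating short from long gestures is an assumed value for illustration.

def classify_sgg(start, end, long_threshold_px=400):
    """Classify a single point-to-point eye movement.

    Returns (axis, length), where axis is 'horizontal' or 'vertical'
    and length is 'short' or 'long'.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    # The dominant displacement axis determines the gesture direction.
    axis = "horizontal" if abs(dx) > abs(dy) else "vertical"
    magnitude = max(abs(dx), abs(dy))
    length = "long" if magnitude >= long_threshold_px else "short"
    return axis, length

# Example: a 500 px rightward saccade is a long horizontal SGG.
print(classify_sgg((100, 300), (600, 310)))  # ('horizontal', 'long')
```

In practice a real implementation would first segment fixations from raw gaze samples and filter out noise before classifying the movement; this sketch only covers the final axis/length decision.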



Published In

ETRA '10: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
March 2010
353 pages
ISBN:9781605589947
DOI:10.1145/1743666

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. gaze gestures
  2. gaze interaction
  3. interaction design

Qualifiers

  • Research-article

Conference

ETRA '10
ETRA '10: Eye Tracking Research and Applications
March 22 - 24, 2010
Austin, Texas

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)


Article Metrics

  • Downloads (last 12 months): 15
  • Downloads (last 6 weeks): 0
Reflects downloads up to 01 Sep 2024


Cited By

  • (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies, 183(C). DOI: 10.1016/j.ijhcs.2023.103196. Online: 14 Mar 2024.
  • (2024) Design recommendations of target size and tracking speed under circular and square trajectories for smooth pursuit with Euclidean algorithm in eye-control system. Displays, 81, 102608. DOI: 10.1016/j.displa.2023.102608. Online: Jan 2024.
  • (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580871. Online: 19 Apr 2023.
  • (2023) Affordance-Guided User Elicitation of Interaction Concepts for Unimodal Gaze Control of Potential Holographic 3D UIs in Automotive Applications. 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 14-19. DOI: 10.1109/ISMAR-Adjunct60411.2023.00011. Online: 16 Oct 2023.
  • (2023) Study on the brightness and graphical display object directions of the Single-Gaze-Gesture user interface. Displays, 80, 102537. DOI: 10.1016/j.displa.2023.102537. Online: Dec 2023.
  • (2022) Emerging Wearable Biosensor Technologies for Stress Monitoring and Their Real-World Applications. Biosensors, 12(12), 1097. DOI: 10.3390/bios12121097. Online: 30 Nov 2022.
  • (2022) Design and Evaluation of a Silent Speech-Based Selection Method for Eye-Gaze Pointing. Proceedings of the ACM on Human-Computer Interaction, 6(ISS), 328-353. DOI: 10.1145/3567723. Online: 14 Nov 2022.
  • (2022) Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1-12. DOI: 10.1145/3562939.3565619. Online: 29 Nov 2022.
  • (2022) Methodological Standards in Accessibility Research on Motor Impairments: A Survey. ACM Computing Surveys, 55(7), 1-35. DOI: 10.1145/3543509. Online: 15 Dec 2022.
  • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 832-842. DOI: 10.1109/VR51125.2022.00105. Online: Mar 2022.
