DOI: 10.1145/2168556.2168579

Simple gaze gestures and the closure of the eyes as an interaction technique

Published: 28 March 2012

Abstract

We created a set of gaze gestures that utilize the following three elements: simple one-segment gestures, off-screen space, and the closure of the eyes. These gestures are to be used as the moving tool in a gaze-only controlled drawing application. We tested our gaze gestures with 24 participants and analyzed the gesture durations, the accuracy of the stops, and the gesture performance. We found that the difference in gesture durations between short and long gestures was so small that there is no need to choose between them. The stops made by closing both eyes were accurate, and the input method worked well for this purpose. With some adjustments and with the possibility for personal settings, the gesture performance and the accuracy of the stops can become even better.
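The technique the abstract describes — one-segment gestures into off-screen space as a move command, and closing both eyes as a stop command — can be sketched as a simple event loop over gaze samples. The following is a hypothetical illustration, not the authors' implementation: the screen resolution, the sample format, and the threshold separating a deliberate closure from an ordinary blink are all assumptions.

```python
from dataclasses import dataclass

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution
CLOSURE_MS = 400                 # assumed threshold separating a deliberate
                                 # closure of both eyes from a blink

@dataclass
class GazeSample:
    t_ms: int        # timestamp in milliseconds
    x: float         # gaze point; may fall outside the screen
    y: float
    eyes_open: bool  # False when both eyes are closed

def classify(samples):
    """Yield ('move', direction) when gaze crosses a screen edge
    (a one-segment off-screen gesture) and ('stop',) when both eyes
    stay closed longer than a blink."""
    closed_since = None
    was_on_screen = True
    for s in samples:
        if not s.eyes_open:
            if closed_since is None:
                closed_since = s.t_ms
            elif s.t_ms - closed_since >= CLOSURE_MS:
                yield ('stop',)
                closed_since = None  # report once per closure
            continue
        closed_since = None
        on_screen = 0 <= s.x < SCREEN_W and 0 <= s.y < SCREEN_H
        if was_on_screen and not on_screen:
            # gesture direction = which screen edge was crossed
            if s.x < 0:
                yield ('move', 'left')
            elif s.x >= SCREEN_W:
                yield ('move', 'right')
            elif s.y < 0:
                yield ('move', 'up')
            else:
                yield ('move', 'down')
        was_on_screen = on_screen
```

Feeding this a stream in which the gaze exits the left edge and later both eyes close for half a second would produce a `('move', 'left')` event followed by a `('stop',)` event. The paper's observation that the closure-based stops were accurate corresponds here to the duration threshold filtering out involuntary blinks.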




Published In

ETRA '12: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2012
420 pages
ISBN:9781450312219
DOI:10.1145/2168556
Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. closure of both eyes
  2. eye tracking
  3. gaze control
  4. gaze gestures
  5. gaze-based interaction
  6. off-screen space

Qualifiers

  • Research-article

Conference

ETRA '12
ETRA '12: Eye Tracking Research and Applications
March 28 - 30, 2012
Santa Barbara, California

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%

Article Metrics

  • Downloads (Last 12 months)23
  • Downloads (Last 6 weeks)2
Reflects downloads up to 13 Jan 2025

Cited By

  • (2024) EyeWithShut: Exploring Closed Eye Features to Estimate Eye Position. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 157-161. DOI: 10.1145/3675094.3677605. Online publication date: 5-Oct-2024.
  • (2024) Designing Upper-Body Gesture Interaction with and for People with Spinal Muscular Atrophy in VR. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3613904.3642884. Online publication date: 11-May-2024.
  • (2024) GazePuffer: Hands-Free Input Method Leveraging Puff Cheeks for VR. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 331-341. DOI: 10.1109/VR58804.2024.00055. Online publication date: 16-Mar-2024.
  • (2024) Design recommendations of target size and tracking speed under circular and square trajectories for smooth pursuit with Euclidean algorithm in eye-control system. Displays 81, 102608. DOI: 10.1016/j.displa.2023.102608. Online publication date: Jan-2024.
  • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56, 2, 1-38. DOI: 10.1145/3606947. Online publication date: 15-Sep-2023.
  • (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580871. Online publication date: 19-Apr-2023.
  • (2023) Trigger motion and interface optimization of an eye-controlled human-computer interaction system based on voluntary eye blinks. Human–Computer Interaction 39, 5-6, 472-502. DOI: 10.1080/07370024.2023.2195850. Online publication date: 24-Apr-2023.
  • (2022) DEEP: 3D Gaze Pointing in Virtual Reality Leveraging Eyelid Movement. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1-14. DOI: 10.1145/3526113.3545673. Online publication date: 29-Oct-2022.
  • (2022) "I Don't Want People to Look At Me Differently". Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3491102.3517552. Online publication date: 29-Apr-2022.
  • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 832-842. DOI: 10.1109/VR51125.2022.00105. Online publication date: Mar-2022.
