DOI: 10.1145/2168556.2168578

Eye-based head gestures

Published: 28 March 2012

Abstract

A novel method for video-based head gesture recognition is proposed that uses eye information provided by an eye tracker. The method combines gaze and eye-movement data to infer head gestures. Compared to other gesture-based methods, a major advantage is that the user keeps the gaze on the interaction object while interacting. The method has been implemented on a head-mounted eye tracker to detect a set of predefined head gestures. The accuracy of the gesture classifier was evaluated and verified for gaze-based interaction in applications intended for both large public displays and small mobile phone screens. The user study shows that the method detects the defined gestures reliably.
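The abstract outlines the mechanism only at a high level: when the user keeps fixating a target while moving the head, the eyes counter-rotate inside the head, so the pupil moves in the eye-camera image while the estimated gaze point stays put. The sketch below illustrates one way such a detector could be structured; everything concrete in it (the Sample fields, the pixel thresholds, the four-direction quantisation, and the gesture templates) is an illustrative assumption, not the paper's actual algorithm.

from dataclasses import dataclass
from math import hypot
from typing import List, Optional, Tuple

@dataclass
class Sample:
    t: float                    # timestamp (seconds)
    gaze: Tuple[float, float]   # gaze point in scene/display coordinates (px)
    pupil: Tuple[float, float]  # pupil centre in the eye-camera image (px)

GAZE_STABLE_PX = 25.0  # assumed: max gaze drift allowed during a gesture
PUPIL_STEP_PX = 8.0    # assumed: min per-step pupil displacement that counts

def direction(dx: float, dy: float) -> str:
    """Quantise a pupil displacement into one of four compass tokens."""
    if abs(dx) >= abs(dy):
        return "E" if dx > 0 else "W"
    return "S" if dy > 0 else "N"

def extract_strokes(samples: List[Sample]) -> List[str]:
    """Collect direction tokens for steps where the pupil moves in the eye
    image while the gaze point stays (nearly) fixed on the target -- the
    signature of a head movement compensated by the eyes."""
    strokes: List[str] = []
    for a, b in zip(samples, samples[1:]):
        gaze_drift = hypot(b.gaze[0] - a.gaze[0], b.gaze[1] - a.gaze[1])
        dx, dy = b.pupil[0] - a.pupil[0], b.pupil[1] - a.pupil[1]
        if gaze_drift <= GAZE_STABLE_PX and hypot(dx, dy) >= PUPIL_STEP_PX:
            tok = direction(dx, dy)
            if not strokes or strokes[-1] != tok:  # collapse repeated tokens
                strokes.append(tok)
    return strokes

# Hypothetical templates -- the paper's actual gesture set is not given here,
# so these stroke sequences are illustrative only.
GESTURES = {
    ("N", "S"): "nod",
    ("W", "E"): "shake-right",
    ("E", "W"): "shake-left",
}

def classify(samples: List[Sample]) -> Optional[str]:
    """Map a sample window to a gesture label, or None if nothing matches."""
    return GESTURES.get(tuple(extract_strokes(samples)), None)

In use, one would feed classify a window of samples recorded while the gaze rests on an interaction object; it returns a gesture label or None. A real implementation would also need timing constraints and thresholds tuned to the tracker's calibration.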




Published In

ETRA '12: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2012
420 pages
ISBN: 9781450312219
DOI: 10.1145/2168556
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. eye tracker
  2. gaze interaction
  3. head gestures
  4. interaction

Qualifiers

  • Research-article

Conference

ETRA '12: Eye Tracking Research and Applications
March 28-30, 2012
Santa Barbara, California

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%


Bibliometrics & Citations


Article Metrics

  • Downloads (last 12 months): 30
  • Downloads (last 6 weeks): 2
Reflects downloads up to 01 Sep 2024


Cited By

  • (2024) Hands-free Selection in Scroll Lists for AR Devices. Proceedings of Mensch und Computer 2024, pp. 323-330. DOI: 10.1145/3670653.3670671. Online publication date: 1-Sep-2024.
  • (2024) Exploring Optimal Placement of Head-Based Hierarchical Marking Menus on Smartphones. Mobile and Ubiquitous Systems: Computing, Networking and Services, pp. 381-392. DOI: 10.1007/978-3-031-63992-0_25. Online publication date: 19-Jul-2024.
  • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), pp. 1-55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023.
  • (2023) Vergence Matching: Inferring Attention to Objects in 3D Environments for Gaze-Assisted Selection. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1-15. DOI: 10.1145/3544548.3580685. Online publication date: 19-Apr-2023.
  • (2023) Exploring Trajectory Data in Augmented Reality: A Comparative Study of Interaction Modalities. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 790-799. DOI: 10.1109/ISMAR59233.2023.00094. Online publication date: 16-Oct-2023.
  • (2023) Leap to the Eye: Implicit Gaze-based Interaction to Reveal Invisible Objects for Virtual Environment Exploration. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 214-222. DOI: 10.1109/ISMAR59233.2023.00036. Online publication date: 16-Oct-2023.
  • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 832-842. DOI: 10.1109/VR51125.2022.00105. Online publication date: Mar-2022.
  • (2022) Responsive human-computer interaction model based on recognition of facial landmarks using machine learning algorithms. Multimedia Tools and Applications 81(13), pp. 18011-18031. DOI: 10.1007/s11042-022-12775-6. Online publication date: 8-Mar-2022.
  • (2021) GazeWheels: Recommendations for using wheel widgets for feedback during dwell-time gaze input. it - Information Technology 63(3), pp. 145-156. DOI: 10.1515/itit-2020-0042. Online publication date: 13-May-2021.
  • (2021) HGaze Typing: Head-Gesture Assisted Gaze Typing. ACM Symposium on Eye Tracking Research and Applications, pp. 1-11. DOI: 10.1145/3448017.3457379. Online publication date: 25-May-2021.
