Eye gaze assisted human-computer interaction in a hand gesture controlled multi-display environment

Published: 26 October 2012

Abstract

A human-computer interaction (HCI) framework that processes user input in a multi-display environment detects and interprets dynamic hand gesture input, enabling fully contactless application control in environments equipped with large displays. This framework was extended with a new input modality that incorporates human gaze into the interaction. The main contribution of this work is the ability to combine arbitrary types of computer input and obtain a detailed view of the behaviour of each modality; this information is available as high-speed data samples received in real time. The framework is designed with particular regard to gaze and hand gesture input in multi-display environments with large-area screens.


    Published In

    Gaze-In '12: Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction
    October 2012
    88 pages
    ISBN:9781450315166
    DOI:10.1145/2401836

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gaze based interaction
    2. hand gesture interaction
    3. multimodal interfaces

    Qualifiers

    • Research-article

    Conference

    ICMI '12
    Sponsor: ICMI '12: International Conference on Multimodal Interaction
    October 26, 2012
    Santa Monica, California

    Acceptance Rates

    Overall Acceptance Rate 19 of 21 submissions, 90%

    Cited By

    • (2024) EAGLE: Eyegaze-Assisted Guidance and Learning Evaluation for Lifelogging Retrieval. Proceedings of the 7th Annual ACM Workshop on the Lifelog Search Challenge, pp. 18-23. DOI: 10.1145/3643489.3661115. Online publication date: 10-Jun-2024.
    • (2023) Effects of activity time limitation on gesture elicitation for form creation. Journal of Engineering Design, 34(11), pp. 963-985. DOI: 10.1080/09544828.2023.2271773. Online publication date: 26-Oct-2023.
    • (2016) Design and evaluation of a gaze tracking system for free-space interaction. Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, pp. 1676-1685. DOI: 10.1145/2968219.2968336. Online publication date: 12-Sep-2016.
    • (2015) Gaze-Assisted User Intention Prediction for Initial Delay Reduction in Web Video Access. Sensors, 15(6), pp. 14679-14700. DOI: 10.3390/s150614679. Online publication date: 19-Jun-2015.
    • (2015) Remote Gaze and Gesture Tracking on the Microsoft Kinect. Proceedings of the Annual Meeting of the Australian Special Interest Group for Computer Human Interaction, pp. 167-176. DOI: 10.1145/2838739.2838778. Online publication date: 7-Dec-2015.
    • (2015) Arcade+. Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play, pp. 271-275. DOI: 10.1145/2793107.2793145. Online publication date: 5-Oct-2015.
    • (2015) An Empirical Investigation of Gaze Selection in Mid-Air Gestural 3D Manipulation. Human-Computer Interaction – INTERACT 2015, pp. 315-330. DOI: 10.1007/978-3-319-22668-2_25. Online publication date: 30-Aug-2015.
    • (2014) Multimodal Input for Perceptual User Interfaces. Interactive Displays, pp. 285-312. DOI: 10.1002/9781118706237.ch9. Online publication date: 12-Jul-2014.
