DOI: 10.1145/3490632.3497836
MUM Conference Proceedings · Poster

Exploring Potential Gestures for Controlling an Eye-Tracker Based System

Published: 25 February 2022

Abstract

Body gestures can serve as an intuitive way to interact with computerized systems. Previous studies explored gesture-based interaction mostly with digital displays, so there is no standard set of gestures for a system that lacks a display. In this work we conducted a pilot study to explore the potential of using gestures to control an eye-tracker-based mobile museum visitors' guide. Our objective was to identify a user-defined set of gestures for controlling the mobile guide. We present the preliminary results of the experiment and discuss the participants' suggestions and concerns about using this type of interaction.
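The abstract does not describe how the user-defined gesture set was derived, but gesture-elicitation studies commonly rank candidate gestures with an agreement score: for each referent (system command), the proposals are grouped by identical gestures, and the score is the sum of the squared group proportions (Wobbrock et al., 2005). The sketch below is illustrative only; the gesture names and pilot data are hypothetical, not taken from this study.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score A_r for one referent:
    A_r = sum over groups of identical gestures P_i of (|P_i| / |P_r|)^2,
    where P_r is the full set of proposals for referent r.
    Ranges from 1/|P_r| (no agreement) to 1.0 (full agreement)."""
    total = len(proposals)
    counts = Counter(proposals)  # group identical gestures
    return sum((n / total) ** 2 for n in counts.values())

# Hypothetical pilot data: gestures proposed for a "next exhibit" command
proposals = ["swipe-right", "swipe-right", "point", "swipe-right", "nod"]
print(round(agreement_score(proposals), 2))  # 0.44
```

A higher score indicates stronger consensus among participants, so the most-agreed-upon gesture per command would be assigned to the final set.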


Cited By

  • (2022) 3D Gaze Estimation Using RGB-IR Cameras. Sensors 23(1), 381. DOI: 10.3390/s23010381. Online publication date: 29-Dec-2022.


    Published In

    MUM '21: Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia
    December 2021
    263 pages
    ISBN:9781450386432
    DOI:10.1145/3490632
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. Body gestures
    2. Cultural heritage
3. Eye tracking
    4. Multimodality

    Qualifiers

    • Poster
    • Research
    • Refereed limited

    Conference

    MUM 2021

    Acceptance Rates

    Overall Acceptance Rate 190 of 465 submissions, 41%

