DOI: 10.1145/3160504.3160532

Analyzing the benefits of the combined interaction of head and eye tracking in 3D visualization information

Published: 23 October 2017

Abstract

This work presents an evaluation of the combined use of eye tracking and head tracking in a 3D information visualization environment. A task-based evaluation of the interactions was conducted in a prototype based on 3D scatter plots, covering navigation, selection, filtering, and other interactions typical of an information visualization tool. The tasks were performed using head tracking for navigation and eye tracking for selection, and they were assessed with quantitative metrics (task completion time and questionnaire responses) and qualitative metrics (gathered through the Think-Aloud Protocol). The results show that the "click by blinking" configuration was unstable, whereas head tracking as a navigation method provided greater accuracy in the interaction.

Cited By

  • (2023) Pactolo Bar: An Approach to Mitigate the Midas Touch Problem in Non-Conventional Interaction. Sensors 23(4), 2110. DOI: 10.3390/s23042110. Online publication date: 13-Feb-2023

    Published In

    IHC '17: Proceedings of the XVI Brazilian Symposium on Human Factors in Computing Systems
    October 2017
    622 pages
    ISBN:9781450363778
    DOI:10.1145/3160504
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org

    In-Cooperation

    • UDESC: Santa Catarina State University
    • Springer
    • SBC: Sociedade Brasileira de Computação
    • ACM: Association for Computing Machinery
    • UFSCar: Federal University of São Carlos
    • CAPES: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
    • UFPR: Universidade Federal do Paraná
    • CNPq: Conselho Nacional de Desenvolvimento Científico e Tecnológico
    • CGI.br: Comitê Gestor da Internet no Brasil
    • OU: The Open University

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 23 October 2017


    Author Tags

    1. 3D Visualization Information
    2. Eye Tracking
    3. Head Tracking
    4. Interaction

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    IHC 2017

    Acceptance Rates

    IHC '17 Paper Acceptance Rate 66 of 184 submissions, 36%;
    Overall Acceptance Rate 331 of 973 submissions, 34%
