Suggestions for Interface Design Using Head Tracking and Voice Commands

Published: 04 October 2016

Abstract

Multimodal interaction has been applied in many fields, such as medicine, assistive technologies, and interaction in public environments, among others. It is important not only to develop the underlying technologies (hardware and software) but also to study and design ways to optimize the use of these novel interfaces. This work aims to identify the main problems that arise when head-tracking interaction is combined with voice commands, a form of multimodal interaction. The evaluation focuses on the lowest level of interaction, that is, actions that are more physical than cognitive, such as clicking, dragging and dropping, and scrolling a page. Based on the results of this study, we propose suggestions for improving the design of interfaces that use this form of interaction.
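The interaction the abstract describes, with head tracking positioning a pointer and voice commands triggering low-level actions, can be illustrated with a minimal sketch. This is not the authors' implementation; the command vocabulary and the `Pointer`/`dispatch` names below are hypothetical, chosen only to show how the two modalities might be fused:

```python
# Minimal sketch (assumptions, not the paper's system): fuse a head-tracked
# pointer position with a recognized voice command to produce one of the
# low-level actions the paper evaluates (click, drag and drop, scroll).

from dataclasses import dataclass


@dataclass
class Pointer:
    x: float  # head-tracked cursor position, normalized to [0, 1]
    y: float


def dispatch(command: str, pointer: Pointer, drag_origin=None):
    """Map a recognized voice command plus the current head-tracked
    pointer position into a UI action tuple. Command names are hypothetical."""
    command = command.strip().lower()
    if command == "click":
        # Click wherever the head-tracked cursor currently is.
        return ("click", pointer.x, pointer.y)
    if command == "drag":
        # Begin a drag at the current pointer position.
        return ("drag_start", pointer.x, pointer.y)
    if command == "drop" and drag_origin is not None:
        # Finish the drag: move from the stored origin to the new position.
        return ("drop", drag_origin, (pointer.x, pointer.y))
    if command in ("scroll up", "scroll down"):
        return ("scroll", command.split()[1])
    return ("unrecognized", command)
```

The point of the sketch is the division of labor: the voice channel carries the discrete intent ("click", "drop"), while the continuous head-tracking channel supplies the spatial parameter, so neither modality alone has to do both jobs.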


Cited By

  • (2023) Pactolo Bar: An Approach to Mitigate the Midas Touch Problem in Non-Conventional Interaction. Sensors 23(4), 2110. DOI: 10.3390/s23042110. Online publication date: 13-Feb-2023.

    Published In

    IHC '16: Proceedings of the 15th Brazilian Symposium on Human Factors in Computing Systems
    October 2016
    431 pages
    ISBN:9781450352352
    DOI:10.1145/3033701
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Head Tracking
    2. Multimodal Interfaces
    3. Usability
    4. Voice Commands

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    IHC '16

    Acceptance Rates

    IHC '16 Paper Acceptance Rate: 58 of 158 submissions, 37%
    Overall Acceptance Rate: 331 of 973 submissions, 34%
