DOI: 10.1007/978-3-540-76702-2_13
Article

Designing Eyes-Free Interaction

Published: 15 March 2023

Abstract

As the form factors of computational devices diversify, the concept of eyes-free interaction is becoming increasingly relevant: it is no longer hard to imagine use scenarios in which screens are inappropriate. However, there is currently little consensus about this term. It is regularly employed in different contexts and with different intents. One key consequence of this multiplicity of meanings is a lack of easily accessible insights into how to best build an eyes-free system. This paper seeks to address this issue by thoroughly reviewing the literature, proposing a concise definition and presenting a set of design principles. The application of these principles is then elaborated through a case study of the design of an eyes-free motion input system for a wearable device.




Published In

Haptic and Audio Interaction Design: Second International Workshop, HAID 2007 Seoul, South Korea, November 29-30, 2007 Proceedings
Nov 2007
132 pages
ISBN:978-3-540-76701-5
DOI:10.1007/978-3-540-76702-2

Publisher

Springer-Verlag

Berlin, Heidelberg


Author Tags

  1. Eyes-free interaction
  2. design principles
  3. motion input

Qualifiers

  • Article


Cited By

  • (2024) BodyTouch. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(4), 1-22. DOI: 10.1145/3631426. Online publication date: 12 Jan 2024.
  • (2023) Cyclists’ Use of Technology While on Their Bike. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3544548.3580971. Online publication date: 19 Apr 2023.
  • (2021) Understanding the Design Space of Mouth Microgestures. Proceedings of the 2021 ACM Designing Interactive Systems Conference, 1068-1081. DOI: 10.1145/3461778.3462004. Online publication date: 28 Jun 2021.
  • (2021) Wearable Interactions for Users with Motor Impairments: Systematic Review, Inventory, and Research Implications. Proceedings of the 23rd International ACM SIGACCESS Conference on Computers and Accessibility, 1-15. DOI: 10.1145/3441852.3471212. Online publication date: 17 Oct 2021.
  • (2013) Comparing modalities and feedback for peripheral interaction. CHI '13 Extended Abstracts on Human Factors in Computing Systems, 1263-1268. DOI: 10.1145/2468356.2468582. Online publication date: 27 Apr 2013.
