DOI: 10.1145/2667317.2667414

Gesturing on the Steering Wheel: a User-elicited taxonomy

Published: 17 September 2014

Abstract

"Eyes on the road, hands on the wheel" is a crucial principle to take into account when designing interactions for current in-vehicle interfaces. Gesture interaction is a promising modality that can be implemented following this principle in order to reduce driver distraction and increase safety. We present the results of a user elicitation study for gestures performed on the surface of the steering wheel. We asked 40 participants to elicit 6 gestures each, for a total of 240 gestures. Based on these results, we derived a taxonomy of gestures performed on the steering wheel. The analysis offers useful suggestions for the design of in-vehicle gestural interfaces based on this approach.
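A standard way to quantify consensus in elicitation studies of this kind (though not necessarily the exact analysis performed in this paper) is the agreement score of Wobbrock et al.: for each referent, sum the squared fractions of participants who proposed identical gestures. A minimal sketch, using purely hypothetical gesture labels and counts:

```python
# Hedged sketch of the agreement score from gesture elicitation studies
# (Wobbrock et al., CHI '09). For one referent r with proposal set P_r,
# partitioned into groups P_i of identical gestures:
#     A_r = sum over i of (|P_i| / |P_r|)^2
# The gesture labels and distribution below are illustrative only.
from collections import Counter

def agreement_score(gestures):
    """Agreement score for one referent.

    gestures: list of gesture labels, one per participant.
    Returns a value in (0, 1]; 1.0 means every participant
    proposed the same gesture.
    """
    n = len(gestures)
    counts = Counter(gestures)
    return sum((c / n) ** 2 for c in counts.values())

# Hypothetical elicited data for one referent, 40 participants
# (matching the study's participant count, but invented labels):
elicited = ["swipe-right"] * 25 + ["tap"] * 10 + ["circle"] * 5
print(agreement_score(elicited))
```

Higher scores indicate stronger consensus among participants, which in turn suggests a gesture is a better candidate for inclusion in a user-defined gesture set.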




    Published In

    AutomotiveUI '14: Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
    September 2014
    287 pages
    ISBN:9781450332125
    DOI:10.1145/2667317
    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Design
    2. gestural interaction
    3. human factors

    Qualifiers

    • Tutorial
    • Research
    • Refereed limited

    Conference

    AutomotiveUI '14

    Acceptance Rates

    AutomotiveUI '14 Paper Acceptance Rate 36 of 79 submissions, 46%;
    Overall Acceptance Rate 248 of 566 submissions, 44%


    Cited By

    • (2024) GraspUI: Seamlessly Integrating Object-Centric Gestures within the Seven Phases of Grasping. Proceedings of the 2024 ACM Designing Interactive Systems Conference, pp. 1275-1289. DOI: 10.1145/3643834.3661551. Online publication date: 1-Jul-2024
    • (2024) Expanding V2X with V2DUIs: Distributed User Interfaces for Media Consumption in the Vehicle-to-Everything Era. Proceedings of the 2024 ACM International Conference on Interactive Media Experiences, pp. 394-401. DOI: 10.1145/3639701.3663643. Online publication date: 7-Jun-2024
    • (2024) From Smart Buildings to Smart Vehicles: Mobile User Interfaces for Multi-Environment Interactions. 2024 International Conference on Development and Application Systems (DAS), pp. 152-155. DOI: 10.1109/DAS61944.2024.10541208. Online publication date: 23-May-2024
    • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), pp. 1-55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023
    • (2023) Transferable Microgestures Across Hand Posture and Location Constraints: Leveraging the Middle, Ring, and Pinky Fingers. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, pp. 1-17. DOI: 10.1145/3586183.3606713. Online publication date: 29-Oct-2023
    • (2023) Tap or Swipe? Effects of Interaction Gestures for Retrieval of Match Statistics via Second Screen on Watching Soccer on TV. Proceedings of the 2023 ACM International Conference on Interactive Media Experiences, pp. 303-308. DOI: 10.1145/3573381.3596473. Online publication date: 12-Jun-2023
    • (2023) SparseIMU: Computational Design of Sparse IMU Layouts for Sensing Fine-grained Finger Microgestures. ACM Transactions on Computer-Human Interaction 30(3), pp. 1-40. DOI: 10.1145/3569894. Online publication date: 10-Jun-2023
    • (2023) Affordance-Based and User-Defined Gestures for Spatial Tangible Interaction. Proceedings of the 2023 ACM Designing Interactive Systems Conference, pp. 1500-1514. DOI: 10.1145/3563657.3596032. Online publication date: 10-Jul-2023
    • (2023) In-vehicle Performance and Distraction for Midair and Touch Directional Gestures. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1-13. DOI: 10.1145/3544548.3581335. Online publication date: 19-Apr-2023
    • (2022) Gesture and Voice Commands to Interact With AR Windshield Display in Automated Vehicle: A Remote Elicitation Study. Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, pp. 171-182. DOI: 10.1145/3543174.3545257. Online publication date: 17-Sep-2022
