research-article

Hand gesture-based visual user interface for infotainment

Published: 17 October 2012

Abstract

We present a real-time vision-based system that discriminates hand gestures performed by in-vehicle front-row seat occupants for accessing the infotainment system. A hand gesture-based visual user interface may be more natural and intuitive to the user than the current tactile interaction interface. Consequently, it may encourage gaze-free interaction, which can alleviate driver distraction without limiting the user's infotainment experience. The system uses visible and depth images of the dashboard and center-console area in the vehicle. The first step of the algorithm represents the image area with a modified histogram-of-oriented-gradients (HOG) descriptor and uses a support vector machine (SVM) to classify whether the driver, the passenger, or no one is interacting with the region of interest. The second step extracts gesture characteristics from the temporal dynamics of the features derived in the first step, which are then fed to an SVM to classify the gesture into one of six classes of hand gestures. The rate of correct user classification into one of the three classes is 97.9% on average. Average hand gesture classification rates for the driver and passenger using color and depth input are above 94%. These rates were achieved on in-vehicle data collected over varying illumination conditions and human subjects. This approach demonstrates the feasibility of a hand gesture-based in-vehicle visual user interface.
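The two-stage pipeline the abstract describes (a HOG-like descriptor per frame, temporal statistics of those descriptors, then a classifier) can be sketched in dependency-free Python. Everything here is an illustrative assumption rather than the authors' implementation: the descriptor is a single-cell, unnormalized simplification of their modified HOG, the mean/range temporal features are a stand-in for their gesture dynamics, and a nearest-centroid rule replaces the SVM (trained with LIBSVM in the paper) to keep the sketch self-contained.

```python
import math

def hog_descriptor(image, n_bins=9):
    """Simplified single-cell HOG over a 2-D grayscale image (list of
    lists). Gradients come from central differences; each interior pixel
    votes its gradient magnitude into an orientation bin over [0, 180).
    The paper uses a modified multi-cell HOG; this is a minimal stand-in."""
    h, w = len(image), len(image[0])
    hist = [0.0] * n_bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = image[y][x + 1] - image[y][x - 1]
            gy = image[y + 1][x] - image[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang / 180.0 * n_bins) % n_bins] += mag
    total = sum(hist) or 1.0                 # L1-normalize (avoid /0)
    return [v / total for v in hist]

def temporal_features(frames):
    """Stage-two features: per-bin mean and range of the descriptor
    across a gesture's frames, capturing its temporal dynamics."""
    descs = [hog_descriptor(f) for f in frames]
    feats = []
    for i in range(len(descs[0])):
        col = [d[i] for d in descs]
        feats.append(sum(col) / len(col))    # mean over time
        feats.append(max(col) - min(col))    # range over time
    return feats

def classify(feat, centroids):
    """Nearest-centroid classifier standing in for the paper's SVM:
    returns the label whose centroid is closest in squared distance."""
    best, best_d = None, float("inf")
    for label, c in centroids.items():
        d = sum((a - b) ** 2 for a, b in zip(feat, c))
        if d < best_d:
            best, best_d = label, d
    return best
```

As a usage example, a synthetic clip of frames with a vertical intensity edge yields an 18-dimensional feature vector (9 bins × mean and range), which `classify` matches against per-gesture centroids learned from training clips.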




Published In

AutomotiveUI '12: Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
October 2012
280 pages
ISBN:9781450317511
DOI:10.1145/2390256


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Kinect
  2. contact-free
  3. hand-gesture recognition
  4. infotainment
  5. user determination

Qualifiers

  • Research-article

Conference

AutomotiveUI '12

Acceptance Rates

Overall Acceptance Rate 248 of 566 submissions, 44%


Cited By

  • (2024) Expanding V2X with V2DUIs: Distributed User Interfaces for Media Consumption in the Vehicle-to-Everything Era. Proceedings of the 2024 ACM International Conference on Interactive Media Experiences, pp. 394–401. DOI: 10.1145/3639701.3663643. Online publication date: 7 Jun 2024.
  • (2023) Enhancing User Engagement in Shared Autonomous Vehicles: An Innovative Gesture-Based Windshield Interaction System. Applied Sciences 13(17), 9901. DOI: 10.3390/app13179901. Online publication date: 1 Sep 2023.
  • (2023) Pair-Less Bluetooth for Touchless Interaction. 2023 IEEE 20th Consumer Communications & Networking Conference (CCNC), pp. 871–874. DOI: 10.1109/CCNC51644.2023.10059727. Online publication date: 8 Jan 2023.
  • (2023) In-vehicle air gesture design: impacts of display modality and control orientation. Journal on Multimodal User Interfaces 17(4), 215–230. DOI: 10.1007/s12193-023-00415-8. Online publication date: 14 Sep 2023.
  • (2023) Human Factors in Driving. Handbook of Human-Machine Systems, pp. 333–347. DOI: 10.1002/9781119863663.ch28. Online publication date: 7 Jul 2023.
  • (2022) Sensoring the Neck: Classifying Movements and Actions with a Neck-Mounted Wearable Device. Sensors 22(12), 4313. DOI: 10.3390/s22124313. Online publication date: 7 Jun 2022.
  • (2022) A Design Space for Human Sensor and Actuator Focused In-Vehicle Interaction Based on a Systematic Literature Review. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(2), 1–51. DOI: 10.1145/3534617. Online publication date: 7 Jul 2022.
  • (2022) Iteratively Designing Gesture Vocabularies: A Survey and Analysis of Best Practices in the HCI Literature. ACM Transactions on Computer-Human Interaction 29(4), 1–54. DOI: 10.1145/3503537. Online publication date: 5 May 2022.
  • (2022) Mid-air gestures for in-vehicle media player: elicitation, segmentation, recognition, and eye-tracking testing. SN Applied Sciences 4(4). DOI: 10.1007/s42452-022-04992-3. Online publication date: 17 Mar 2022.
  • (2022) Infotainment (Displays & Controls) I: Haptics/Ultrasound. User Experience Design in the Era of Automated Driving, pp. 425–443. DOI: 10.1007/978-3-030-77726-5_16. Online publication date: 1 Jan 2022.
