DOI: 10.1145/3010915.3010949

UGI: a multi-dimensional ultrasonic-based interaction approach

Published: 29 November 2016

Abstract

We are currently witnessing an era in which interaction with computers is no longer limited to conventional methods (i.e., keyboard and mouse). Human-Computer Interaction (HCI), as a progressive field of research, has opened up alternatives to traditional interaction techniques. Embedded infrared (IR) sensors, accelerometers and RGBD cameras have become common inputs for devices to recognize gestures and body movements. These sensors are vision-based and, as a result, the devices that incorporate them rely on the presence of light. Ultrasonic sensors, on the other hand, do not suffer this limitation, as they utilize the properties of sound waves. These sensors, however, have mainly been used for distance detection rather than in HCI devices. This paper presents our approach to developing a multi-dimensional interaction input method and tool, Ultrasonic Gesture-based Interaction (UGI), that utilizes ultrasonic sensors. We demonstrate how these sensors can detect object movements and recognize gestures. We present our approach to building the device and demonstrate sample interactions with it. We have also conducted a user study to evaluate our tool and its distance and micro-gesture detection accuracy. This paper reports these results and outlines our future work in the area.
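
As a purely illustrative aside (not taken from the paper, and not the authors' UGI implementation), ultrasonic interaction of this kind is usually built on time-of-flight ranging: the sensor emits a short pulse, the round-trip echo time is converted into a distance, and a gesture is then inferred from how that distance changes over a short window of samples. The minimal Python sketch below shows this principle; the function names, thresholds, and the push/pull heuristic are all hypothetical.

# Illustrative sketch only -- hypothetical names and thresholds, not the UGI code.
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at ~20 C

def echo_time_to_distance_cm(echo_time_s: float) -> float:
    """Convert a round-trip ultrasonic echo time into a one-way distance in cm."""
    return (echo_time_s * SPEED_OF_SOUND_M_PER_S / 2.0) * 100.0

def classify_motion(distances_cm, threshold_cm=2.0):
    """Naive push/pull classification over a short window of distance samples."""
    if len(distances_cm) < 2:
        return "none"
    delta = distances_cm[-1] - distances_cm[0]
    if delta < -threshold_cm:
        return "push"  # hand moved towards the sensor
    if delta > threshold_cm:
        return "pull"  # hand moved away from the sensor
    return "none"

if __name__ == "__main__":
    # Simulated echo times for a hand approaching the sensor (roughly 31 cm down to 15 cm).
    samples = [echo_time_to_distance_cm(t) for t in (0.0018, 0.0015, 0.0012, 0.0009)]
    print([round(d, 1) for d in samples], classify_motion(samples))

A real device would, of course, sample continuously, filter noise, and combine several sensors to obtain the multi-dimensional input described in the abstract.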


Cited By

  • (2020) UltraGesture: Fine-Grained Gesture Sensing and Recognition. IEEE Transactions on Mobile Computing. DOI: 10.1109/TMC.2020.3037241. Online publication date: 2020.
  • (2018) UltraGesture: Fine-Grained Gesture Sensing and Recognition. 2018 15th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON), pp. 1-9. DOI: 10.1109/SAHCN.2018.8397099. Online publication date: June 2018.


    Published In

    OzCHI '16: Proceedings of the 28th Australian Conference on Computer-Human Interaction
    November 2016
    706 pages
    ISBN: 9781450346184
    DOI: 10.1145/3010915
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Sponsors

    • IEEE-SMCS: Systems, Man & Cybernetics Society
    • Australian Comp Soc: Australian Computer Society
    • Data61: Data61, CSIRO
    • ICACHI: International Chinese Association of Computer Human Interaction
    • Infoxchange: Infoxchange
    • HITLab AU: Human Interface Technology Laboratory Australia
    • James Boag: James Boag
    • Tourism Tasmania: Tourism Tasmania
    • HFESA: Human Factors and Ergonomics Society of Australia Inc.
    • IEEEVIC: IEEE Victorian Section
    • UTAS: University of Tasmania, Australia

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 29 November 2016

    Author Tags

    1. 2D interaction
    2. 3D interaction
    3. human computer interaction
    4. interaction design
    5. ultrasonic sensors

    Qualifiers

    • Research-article

    Conference

    OzCHI '16: The 28th Australian Conference on Human-Computer Interaction
    November 29 - December 2, 2016
    Launceston, Tasmania, Australia

    Acceptance Rates

    Overall Acceptance Rate 362 of 729 submissions, 50%
