
Adaptive Gesture Recognition with Variation Estimation for Interactive Systems

Published: 19 December 2014
    Abstract

    This article presents a gesture recognition/adaptation system for human–computer interaction applications that goes beyond activity classification and that, as a complement to gesture labeling, characterizes the movement execution. We describe a template-based recognition method that simultaneously aligns the input gesture to the templates using a Sequential Monte Carlo inference technique. Unlike standard template-based methods built on dynamic programming, such as Dynamic Time Warping, the algorithm has an adaptation process that tracks gesture variation in real time. The method continuously updates, during execution of the gesture, the estimated parameters and recognition results, which offers key advantages for continuous human–machine interaction. The technique is evaluated in several different ways: recognition and early recognition are evaluated on 2D onscreen pen gestures; adaptation is assessed on synthetic data; and both early recognition and adaptation are evaluated in a user study involving 3D free-space gestures. The method is robust to noise, successfully adapts to parameter variation, and performs recognition as well as or better than nonadapting offline template-based methods.
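The abstract's core idea, estimating alignment (where we are in a template) and variation parameters (e.g., amplitude) online with Sequential Monte Carlo, can be sketched as a toy particle filter. This is an illustrative sketch, not the paper's algorithm: the state design (phase plus a single amplitude scale), the noise values, and the function name `smc_template_follower` are all assumptions made for the example.

```python
import numpy as np

def smc_template_follower(template, observations, n_particles=500,
                          phase_noise=0.01, scale_noise=0.01, obs_sigma=0.1):
    """Track where we are in the template (phase in [0, 1]) and how large
    the performance is relative to it (scale), one sample at a time."""
    rng = np.random.default_rng(0)               # fixed seed for reproducibility
    T = len(template)
    phase = rng.uniform(0.0, 0.1, n_particles)   # particles start near the beginning
    scale = rng.uniform(0.5, 2.0, n_particles)   # prior over amplitude variation
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for y in observations:
        # Propagate: phase advances roughly one template step, with jitter;
        # scale follows a random walk so it can keep adapting mid-gesture.
        phase = np.clip(phase + 1.0 / T
                        + rng.normal(0, phase_noise, n_particles), 0.0, 1.0)
        scale = scale + rng.normal(0, scale_noise, n_particles)
        # Weight each particle by the Gaussian likelihood of the new sample.
        idx = np.minimum((phase * (T - 1)).astype(int), T - 1)
        pred = scale * template[idx]
        weights = weights * np.exp(-0.5 * ((y - pred) / obs_sigma) ** 2)
        weights = weights / weights.sum()
        # Resample when the effective sample size collapses.
        if 1.0 / np.sum(weights ** 2) < n_particles / 2:
            keep = rng.choice(n_particles, size=n_particles, p=weights)
            phase, scale = phase[keep], scale[keep]
            weights = np.full(n_particles, 1.0 / n_particles)
        # Posterior-mean estimates, refreshed at every incoming sample.
        estimates.append((float(np.sum(weights * phase)),
                          float(np.sum(weights * scale))))
    return estimates

# Follow a gesture performed 1.5x larger than its template: the scale
# estimate should move toward 1.5 while the phase tracks progress.
template = np.sin(np.linspace(0.0, np.pi, 100))
estimates = smc_template_follower(template, 1.5 * template)
```

Because estimates are refreshed at every sample rather than after the gesture ends, the same loop supports the early recognition and continuous-interaction uses the abstract describes; extending it to several templates (one particle population or an extra discrete state per template) turns the likelihoods into recognition scores.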



        Published In

        ACM Transactions on Interactive Intelligent Systems, Volume 4, Issue 4
        Special Issue on Activity Recognition for Interaction and Regular Article
        January 2015
        190 pages
        ISSN:2160-6455
        EISSN:2160-6463
        DOI:10.1145/2688469

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        Published: 19 December 2014
        Accepted: 01 July 2014
        Received: 01 August 2013
        Published in TIIS Volume 4, Issue 4


        Author Tags

        1. Gesture recognition
        2. adaptive decoding
        3. continuous gesture modeling
        4. gesture analysis
        5. particle filtering
        6. real time

        Qualifiers

        • Research-article
        • Research
        • Refereed

        Cited By

        • (2024) Hands-On Robotics: Enabling Communication Through Direct Gesture Control. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 822-827. DOI: 10.1145/3610978.3640635. Online publication date: 11-Mar-2024.
        • (2024) Explainable AI in human motion. Pattern Recognition 151:C. DOI: 10.1016/j.patcog.2024.110418. Online publication date: 1-Jul-2024.
        • (2024) A Grammar of Expressive Conducting Gestures. Sonic Design, 67-91. DOI: 10.1007/978-3-031-57892-2_5. Online publication date: 10-May-2024.
        • (2023) An Expressivity-Complexity Tradeoff?: User-Defined Gestures from the Wheelchair Space are Mostly Deictic. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-8. DOI: 10.1145/3544549.3585695. Online publication date: 19-Apr-2023.
        • (2023) Understanding Wheelchair Users' Preferences for On-Body, In-Air, and On-Wheelchair Gestures. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3544548.3580929. Online publication date: 19-Apr-2023.
        • (2023) Gesture-Based Human–Machine Interaction: Taxonomy, Problem Definition, and Analysis. IEEE Transactions on Cybernetics 53:1, 497-513. DOI: 10.1109/TCYB.2021.3129119. Online publication date: Jan-2023.
        • (2023) Air Drums, and Bass: Anticipating Musical Gestures in Accelerometer Signals with a Lightweight CNN. 2023 IEEE 33rd International Workshop on Machine Learning for Signal Processing (MLSP), 1-5. DOI: 10.1109/MLSP55844.2023.10285920. Online publication date: 17-Sep-2023.
        • (2022) Democratizing access to collaborative music making over the network using air instruments. Proceedings of the 17th International Audio Mostly Conference, 211-218. DOI: 10.1145/3561212.3561227. Online publication date: 6-Sep-2022.
        • (2022) Enabling Hand Gesture Customization on Wrist-Worn Devices. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3491102.3501904. Online publication date: 29-Apr-2022.
        • (2021) Stochastic-Biomechanic Modeling and Recognition of Human Movement Primitives, in Industry, Using Wearables. Sensors 21:7, 2497. DOI: 10.3390/s21072497. Online publication date: 3-Apr-2021.
