DOI: 10.1145/2617995.2618013
Research article

Gesture in performance with traditional musical instruments and electronics: Use of embodied music cognition and multimodal motion capture to design gestural mapping strategies

Published: 16 June 2014

Abstract

This paper describes the implementation of gestural mapping strategies for performance with a traditional musical instrument and electronics. The approach adopted is informed by embodied music cognition and functional categories of musical gestures. Within this framework, gestures are not seen as means of control subordinated to the resulting musical sounds, but rather as significant elements that contribute to the formation of musical meaning in the same way as auditory features. Moreover, the ecological knowledge of the instrument's gestural repertoire is taken into account, as it defines the action-sound relationships between the instrument and the performer and contributes to forming expectations in listeners. Mapping strategies from a case study of electric guitar performance are then illustrated, describing what motivated the choice of a multimodal motion capture system and how different solutions were adopted in light of both gestural meaning formation and technical constraints.
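To make the idea of a gestural mapping strategy concrete, the following is a minimal, entirely hypothetical sketch (not the authors' implementation): it combines two motion-capture features into a single audio-effect parameter, a simple many-to-one mapping of the kind the abstract alludes to. The feature names, ranges, and weights are illustrative assumptions.

```python
# Hypothetical many-to-one gestural mapping sketch: two motion-capture
# features (hand speed and hand height) drive one effect parameter (gain).
# Feature ranges and weights are assumed, not taken from the paper.

def normalize(value, lo, hi):
    """Clamp and scale a raw sensor reading into [0, 1]."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def map_gesture_to_gain(hand_speed, hand_height):
    """Combine two gesture features into one effect parameter.

    hand_speed  -- in m/s, assumed usable range 0..2
    hand_height -- in m above a reference plane, assumed range 0..1
    """
    speed = normalize(hand_speed, 0.0, 2.0)
    height = normalize(hand_height, 0.0, 1.0)
    # Weighted combination: faster, higher gestures raise the gain.
    return 0.7 * speed + 0.3 * height

print(map_gesture_to_gain(1.0, 0.5))  # a mid-range gesture
```

In practice such a mapping layer would sit between the motion-capture stream and the sound engine; the paper's point is that its design should respect the instrument's existing action-sound relationships rather than treat gesture as an arbitrary control signal.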




        Published In

        MOCO '14: Proceedings of the 2014 International Workshop on Movement and Computing
        June 2014
        184 pages
        ISBN:9781450328142
        DOI:10.1145/2617995

        In-Cooperation

        • SFU: Simon Fraser University

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Author Tags

        1. Gesture
        2. embodied music cognition
        3. expressiveness
        4. guitar
        5. mapping
        6. motion capture
        7. multimodal

        Qualifiers

        • Research-article
        • Research
        • Refereed limited

        Conference

        MOCO '14

        Acceptance Rates

MOCO '14 paper acceptance rate: 24 of 54 submissions (44%)
Overall acceptance rate: 85 of 185 submissions (46%)


Cited By

• (2023) A Fuzzy Gestural AI Approach for Expressive Interactivity in Multitouch Digital Musical Instruments Based on Laban Descriptors. Computer Music Journal, 47(1):22–43, 13 Jun 2023. DOI: 10.1162/comj_a_00673
• (2023) Frequencies of gesture: archiving and re-embodying kinesthetic traces through technical inscription into sound and image. International Journal of Performance Arts and Digital Media, 1–25, 8 Aug 2023. DOI: 10.1080/14794713.2023.2242066
• (2023) A Phenomenological Approach to Wearable Technologies and Viscerality: From embodied interaction to biophysical music performance. Organised Sound, 1–15, 16 Nov 2023. DOI: 10.1017/S1355771823000286
• (2022) Performers' Use of Space and Body in Movement Interaction with A Movement-based Digital Musical Instrument. Proceedings of the 8th International Conference on Movement and Computing, 1–12, 22 Jun 2022. DOI: 10.1145/3537972.3537976
• (2021) Interactive Machine Learning of Musical Gesture. Handbook of Artificial Intelligence for Music, 771–798, 3 Jul 2021. DOI: 10.1007/978-3-030-72116-9_27
• (2020) Evaluation of Inertial Sensor Data by a Comparison with Optical Motion Capture Data of Guitar Strumming Gestures. Sensors, 20(19):5722, 8 Oct 2020. DOI: 10.3390/s20195722
• (2019) Bewegungssonifikation: Psychologische Grundlagen und Auswirkungen der Verklanglichung menschlicher Handlungen in der Rehabilitation, im Sport und bei Musikaufführungen [Movement sonification: psychological foundations and effects of sonifying human actions in rehabilitation, sport, and music performance]. Jahrbuch Musikpsychologie, 28 May 2019. DOI: 10.5964/jbdgm.2018v28.36
• (2019) Instruments of Articulation. Proceedings of the 6th International Conference on Movement and Computing, 1–8, 10 Oct 2019. DOI: 10.1145/3347122.3347133
• (2018) Beacon. Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction, 586–591, 18 Mar 2018. DOI: 10.1145/3173225.3173312
• (2018) Shared periodic performer movements coordinate interactions in duo improvisations. Royal Society Open Science, 5(2):171520, 21 Feb 2018. DOI: 10.1098/rsos.171520
