Abstract
We investigate the use of musical prosody as a coordination strategy in ensemble performance, focussing on the metrically ambiguous case of long sustained notes, where sparse note onsets provide little rhythmic information. Using cluster analysis of the amplitude power curves of long notes in recordings of a violin–cello duo, we examine how performers vary intensity to communicate timing information over the duration of these sustained notes. The elbow method determines the optimal number of clusters, and we present the common intensity shapes employed by the violinist and by the cellist. Analysis of peaks in the intensity curves uncovers correspondences between peak positions and natural subdivisions of musical time: performers tend to use consistent curve shapes that peak at metrical subdivisions within notes, namely at around the 0.25 and 0.75 points of the note event. We hypothesize that the 0.75-point intensity peak functions as an upbeat indicator, while the 0.25-point peak serves to propel the note forward. We surmise that these intensity peaks may serve as auditory cues for marking musical time in held notes, and discuss how this knowledge might be exploited in anticipatory accompaniment systems and audio–score alignment.
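The analysis pipeline the abstract outlines — time-normalising intensity curves, clustering them, picking the number of clusters with the elbow method, and locating intensity peaks as fractions of note duration — can be illustrated with a minimal Python sketch. The data here are synthetic (Gaussian intensity bumps near the 0.25 and 0.75 points), the k-means and elbow implementations are generic textbook versions, and all function names (`resample`, `elbow_k`, `main_peak`) are illustrative; the paper's actual feature extraction (Sonic Visualiser) and peak detection (MATLAB peakfinder) are not reproduced.

```python
import numpy as np

def resample(curve, n=50):
    # Time-normalise a note's intensity curve to n samples and rescale to [0, 1]
    x = np.linspace(0, 1, len(curve))
    y = np.interp(np.linspace(0, 1, n), x, curve)
    span = y.max() - y.min()
    return (y - y.min()) / span if span > 0 else y

def kmeans(data, k, iters=100, seed=0):
    # Basic k-means on curve vectors; returns labels, centers, inertia.
    # Farthest-point initialisation: each new center is the point farthest
    # from those already chosen (deterministic given the seed).
    rng = np.random.default_rng(seed)
    idx = [int(rng.integers(len(data)))]
    while len(idx) < k:
        d = np.min(((data[:, None] - data[idx][None]) ** 2).sum(-1), axis=1)
        idx.append(int(np.argmax(d)))
    centers = data[idx]
    for _ in range(iters):
        dists = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        new = np.array([data[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    inertia = ((data - centers[labels]) ** 2).sum()
    return labels, centers, inertia

def elbow_k(data, kmax=6):
    # Elbow heuristic: choose k where the inertia curve bends most sharply
    inertias = [kmeans(data, k)[2] for k in range(1, kmax + 1)]
    curvature = np.diff(inertias, n=2)  # second difference of inertia vs. k
    return int(np.argmax(curvature)) + 2, inertias

def main_peak(curve):
    # Position of the intensity maximum as a fraction of note duration
    return np.argmax(curve) / (len(curve) - 1)

# Synthetic "long note" intensity curves peaking near 0.25 or 0.75
t = np.linspace(0, 1, 50)
noise = np.random.default_rng(1)
curves = np.array([resample(np.exp(-(t - p) ** 2 / 0.02)
                            + 0.05 * noise.standard_normal(50))
                   for p in [0.25] * 20 + [0.75] * 20])

best_k, _ = elbow_k(curves)
print("elbow suggests k =", best_k)
_, centers, _ = kmeans(curves, best_k)
for c in centers:
    print(f"cluster center peaks at {main_peak(c):.2f} of note duration")
```

On these synthetic curves the elbow heuristic recovers the two underlying shape families, and the cluster centers peak near the 0.25 and 0.75 points — the positions the paper reports for the duo recordings.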
Acknowledgments
The authors thank Kathleen Agres and Laurel Pardue for their participation in the line of sight ensemble interaction experiments. This research was funded in part by the Engineering and Physical Sciences Research Council (EPSRC).
© 2014 Springer International Publishing Switzerland
Vera, B., Chew, E. (2014). Intensity Shaping in Sustained Notes Encodes Metrical Cues for Synchronization in Ensemble Performance. In: Aramaki, M., Derrien, O., Kronland-Martinet, R., Ystad, S. (eds) Sound, Music, and Motion. CMMR 2013. Lecture Notes in Computer Science(), vol 8905. Springer, Cham. https://doi.org/10.1007/978-3-319-12976-1_12
DOI: https://doi.org/10.1007/978-3-319-12976-1_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-12975-4
Online ISBN: 978-3-319-12976-1