Abstract
In this paper we describe the SAME networked platform for context-aware, experience-centric mobile music applications, and we present an implementation of the SAME active music listening paradigm: the Mobile Conductor. It allows users to express themselves by conducting a virtual ensemble playing a MIDI piece of music with their mobile phone. The phone detects the user's hand movements and shapes the performance style by modulating its speed, volume, and intonation.
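The mapping described above, from detected hand movement to performance speed and volume, can be illustrated with a minimal sketch. This is not the SAME platform's actual implementation; the function name, the gesture-energy measure (mean acceleration magnitude normalised against gravity), and the clamping ranges are all hypothetical, chosen only to show how accelerometer input might modulate MIDI tempo and note velocity.

```python
import math

def expressive_params(accel_samples, base_tempo_bpm=100, base_velocity=80):
    """Map accelerometer samples (x, y, z in m/s^2) from a hand-held phone
    to performance parameters: tempo (speed) and MIDI velocity (volume).
    Hypothetical mapping: larger, more energetic gestures yield faster,
    louder playback."""
    # Mean acceleration magnitude as a crude gesture-energy measure.
    energy = sum(math.sqrt(x * x + y * y + z * z)
                 for x, y, z in accel_samples) / len(accel_samples)
    # Normalise against gravity (~9.81 m/s^2): a still hand gives ~1.0.
    norm = energy / 9.81
    # Clamp tempo to half/double the base speed.
    tempo = base_tempo_bpm * min(max(norm, 0.5), 2.0)
    # Clamp loudness to the MIDI velocity range 1..127.
    velocity = int(min(max(base_velocity * norm, 1), 127))
    return tempo, velocity
```

With a still phone (only gravity measured), the sketch returns the unmodified base tempo and velocity; vigorous movement raises both toward their clamped maxima.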
Cite this article
Mancini, M., Varni, G., Kleimola, J. et al. Human movement expressivity for mobile active music listening. J Multimodal User Interfaces 4, 27–35 (2010). https://doi.org/10.1007/s12193-010-0047-z