
Drumkit simulator from everyday desktop objects

Published in: Multimedia Tools and Applications

Abstract

In this paper, an augmented reality application for drumkit simulation is presented. The system classifies percussive sounds produced by the user in an everyday desktop environment, e.g. clapping, snapping, or striking different objects with a pencil, recognizing up to six different classes of drum hits. Each type of user-generated sound is then associated with a predefined drumkit sound, resulting in a natural and intuitive audio interface for drummers and percussionists that only requires a computer with a built-in microphone. A set of audio features and classification techniques is evaluated for the implementation of this system.
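The pipeline the abstract describes (extract timbre features from each detected hit, classify it, and trigger the mapped drumkit sound) can be illustrated with a minimal, self-contained sketch. The two features (zero-crossing rate and spectral centroid) and the nearest-centroid classifier are illustrative stand-ins, not the paper's evaluated feature set or classifier, and the synthetic `low_hit`/`high_hit` signals are hypothetical examples standing in for recorded desk sounds.

```python
import numpy as np

SR = 22050  # assumed sample rate

def features(x, sr=SR):
    """Two simple timbre features: zero-crossing rate and spectral centroid."""
    zcr = np.mean(np.abs(np.diff(np.sign(x)))) / 2.0       # crossings per sample
    mag = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sr)
    centroid = np.sum(freqs * mag) / (np.sum(mag) + 1e-12)  # magnitude-weighted mean frequency
    return np.array([zcr, centroid / (sr / 2)])             # normalise centroid to [0, 1]

def train_centroids(examples):
    """examples: {label: [signal, ...]} -> {label: mean feature vector}."""
    return {lab: np.mean([features(x) for x in xs], axis=0)
            for lab, xs in examples.items()}

def classify(x, centroids):
    """Nearest-centroid decision, mapping a detected hit to a drumkit label."""
    return min(centroids, key=lambda lab: np.linalg.norm(features(x) - centroids[lab]))

rng = np.random.default_rng(0)

def low_hit():
    """Dull thump: low-frequency decaying sine (e.g. knocking on the desk)."""
    t = np.arange(2048) / SR
    return np.sin(2 * np.pi * 120 * t) * np.exp(-t * 40)

def high_hit():
    """Bright click: decaying white-noise burst (e.g. a pencil tap)."""
    return rng.standard_normal(2048) * np.exp(-np.arange(2048) / 200)

centroids = train_centroids({"kick": [low_hit()], "snare": [high_hit()]})
print(classify(low_hit(), centroids))   # -> kick
print(classify(high_hit(), centroids))  # -> snare
```

A real system would precede this with onset detection on the microphone stream and segment a short window after each onset before extracting features; the paper compares richer feature sets and classifiers for that task.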

[Figures 1–5 are available in the full article.]



Acknowledgements

This work has been funded by the Junta de Andalucía under Project No. P11-TIC-7154 and by the Ministerio de Educación, Cultura y Deporte through the Programa Nacional de Movilidad de Recursos Humanos of the Plan Nacional de I+D+i 2008-2011, extended by the Agreement of the Council of Ministers of 7 October 2011. The work was done in the context of the Campus de Excelencia Internacional Andalucía Tech, Universidad de Málaga.

Author information


Corresponding author

Correspondence to Ana M. Barbancho.


About this article


Cite this article

Herrero, G., Barbancho, I., Tardón, L.J. et al. Drumkit simulator from everyday desktop objects. Multimed Tools Appl 74, 10195–10213 (2015). https://doi.org/10.1007/s11042-014-2159-z
