Abstract
In this paper, we study key estimation in electronic dance music, an umbrella term for a variety of electronic music subgenres intended for dancing at nightclubs and raves. We start by defining notions of tonality and key, before outlining the basic architecture of a template-based key estimation method. We then report on the tonal characteristics of electronic dance music in order to infer possible modifications to the method described. We create new key profiles by combining these observations with corpus analysis, and add two pre-processing stages to the basic algorithm. We conclude by comparing our profiles to existing ones and testing our modifications on independent datasets of pop and electronic dance music, observing improvements in the performance of our algorithms and suggesting paths for future research.
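To make the template-based architecture mentioned above concrete, here is a minimal sketch of the matching stage: a 12-bin pitch-class profile (PCP) extracted from audio is compared, via cosine similarity, against the twelve rotations of a major and a minor key template. The template values below are the published Krumhansl-Kessler profiles [13]; the paper's contribution is to replace such templates with profiles derived from a corpus of electronic dance music, so this sketch illustrates the general technique rather than the authors' specific profiles.

```python
import numpy as np

# Krumhansl-Kessler key profiles [13], indexed from C upward.
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])
PITCH_CLASSES = ['C', 'C#', 'D', 'Eb', 'E', 'F',
                 'F#', 'G', 'Ab', 'A', 'Bb', 'B']

def cosine_similarity(a, b):
    """Cosine similarity between two pitch-class vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def estimate_key(pcp):
    """Pick the (tonic, mode) whose rotated template best matches the PCP."""
    best_score, best_key = -1.0, None
    for tonic in range(12):
        for mode, profile in (('major', MAJOR), ('minor', MINOR)):
            # Rotate the C-based template so its tonic aligns with `tonic`.
            score = cosine_similarity(pcp, np.roll(profile, tonic))
            if score > best_score:
                best_score, best_key = score, (PITCH_CLASSES[tonic], mode)
    return best_key
```

For example, a profile dominated by the pitch classes C, E, and G yields ('C', 'major'). A higher-resolution PCP, such as the 36-bin profile used in the experiments (see note 7 below), would first be folded down to 12 bins.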
Notes
1. We take this term from Tagg [26] to refer to European Classical Music of the so-called common practice repertoire, on which most treatises on harmony are based.
2. The Music Information Retrieval Evaluation eXchange (MIREX) is an international initiative created to evaluate advances in Music Information Retrieval across different research centres, by quantitatively comparing algorithm performance on test sets that are not available to participants beforehand.
7. After informal testing, we decided to use the following settings in all the experiments reported: mix-down to mono; sampling rate: 44,100 Hz; window size: 4,096 (Hann); hop size: 16,384; frequency range: 25–3,500 Hz; PCP size: 36 bins; weighting size: 1 semitone; similarity: cosine distance. The sketch following these notes shows one way such settings can be wired together.
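One plausible realisation of the settings in note 7, using the Essentia library [1]. Parameter names follow Essentia's public Python API; the filename is a placeholder, and this chain is an illustration under those assumptions, not necessarily the authors' exact pipeline.

```python
import numpy as np
import essentia.standard as es

# Mix-down to mono at 44,100 Hz.
audio = es.MonoLoader(filename='track.wav', sampleRate=44100)()

window = es.Windowing(type='hann', size=4096)        # 4,096-sample Hann window
spectrum = es.Spectrum()
peaks = es.SpectralPeaks(minFrequency=25, maxFrequency=3500,
                         sampleRate=44100)           # 25-3,500 Hz range
hpcp = es.HPCP(size=36,                              # 36-bin PCP
               minFrequency=25, maxFrequency=3500,
               weightType='cosine', windowSize=1.0)  # 1-semitone weighting

pcps = []
# Note that the hop size (16,384) exceeds the window size (4,096), so the
# analysis frames are sparse rather than overlapping.
for frame in es.FrameGenerator(audio, frameSize=4096, hopSize=16384):
    freqs, mags = peaks(spectrum(window(frame)))
    pcps.append(hpcp(freqs, mags))

global_pcp = np.mean(pcps, axis=0)
# Fold the 36 bins (three per semitone) down to 12 pitch classes; this
# simple grouping is approximate, since bin centres straddle semitone
# boundaries. Essentia places the reference pitch class (A, 440 Hz) at
# bin 0, so rotate to a C-based ordering before template matching.
pcp12 = np.roll(global_pcp.reshape(12, 3).sum(axis=1), -3)
```

The resulting `pcp12` vector can then be passed to a template matcher such as the `estimate_key` sketch given after the abstract.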
References
Bogdanov, D., Wack, N., Gómez, E., Gulati, S., Herrera, P., Mayor, O.: ESSENTIA: an open-source library for sound and music analysis. In: Proceedings of the 21st ACM International Conference on Multimedia, pp. 855–858 (2013)
Cannam, C., Mauch, M., Davies, M.: MIREX 2013 entry: Vamp plugins from the Centre for Digital Music (2013). www.music-ir.org
Everett, W.: Making sense of rock’s tonal systems. Music Theory Online, vol. 10(4) (2004)
Dayal, G., Ferrigno, E.: Electronic Dance Music. Grove Music Online. Oxford University Press, Oxford (2012)
Dressler, K., Streich, S.: Tuning frequency estimation using circular statistics. In: Proceedings of the 8th ISMIR, pp. 2–5 (2007)
Gómez, E.: Tonal description of polyphonic audio for music content processing. INFORMS J. Comput. 18(3), 294–304 (2006)
Gómez, E.: Tonal description of music audio signals. Ph.D. thesis, Universitat Pompeu Fabra, Barcelona (2006)
Harte, C.: Towards automatic extraction of harmony information from music signals. Ph.D. thesis, Queen Mary University of London (2010)
Hyer, B.: Tonality. Grove Music Online. Oxford University Press, Oxford (2012)
James, R.: My life would suck without you / Where have you been all my life: tension-and-release structures in tonal rock and non-tonal EDM pop. www.its-her-factory.com/2012/07/my-life-would-suck-without-youwhere-have-you-been-all-my-life-tension-and-release-structures-in-tonal-rock-and-non-tonal-edm-pop. Accessed 16 December 2014
Klapuri, A.: Multipitch analysis of polyphonic music and speech signals using an auditory model. IEEE Trans. Audio Speech Lang. Process. 16(2), 255–266 (2008)
Knees, P., Faraldo, Á., Herrera, P., Vogl, R., Böck, S., Hörschläger, F., Le Goff, M.: Two data sets for tempo estimation and key detection in electronic dance music annotated from user corrections. In: Proceedings of the 16th ISMIR (2015)
Krumhansl, C.L.: Cognitive Foundations of Musical Pitch. Oxford University Press, New York (1990)
Mauch, M., Dixon, S.: Approximate note transcription for the improved identification of difficult chords. In: Proceedings of the 11th ISMIR, pp. 135–140 (2010)
Mauch, M., Cannam, C., Davies, M., Dixon, S., Harte, C., Kolozali, S., Tidhar, D.: OMRAS2 metadata project 2009. In: Proceedings of the 10th ISMIR, Late-Breaking Session (2009)
Moore, A.: The so-called “flattened seventh” in rock. Pop. Music 14(2), 185–201 (1995)
Müller, M., Ewert, S.: Towards timbre-invariant audio features for harmony-based music. IEEE Trans. Audio Speech Lang. Process. 18(3), 649–662 (2010)
Noland, K.: Computational tonality estimation: signal processing and hidden Markov models. Ph.D. thesis, Queen Mary University of London (2009)
Noland, K., Sandler, M.: Signal processing parameters for tonality estimation. In: Proceedings of the 122nd Convention of the Audio Engineering Society (2007)
Pollack, A.W.: Notes on ... series. www.icce.rug.nl/~soundscapes/DATABASES/AWP/awp-notes_on.shtml. Accessed 1 February 2015
Saslaw, J.: Modulation (i). Grove Music Online. Oxford University Press, Oxford (2012)
Schellenberg, E.G., von Scheve, C.: Emotional cues in American popular music: five decades of the Top 40. Psychol. Aesthetics Creativity Arts 6(3), 196–203 (2012)
Sha’ath, I.: Estimation of key in digital music recordings. Master’s thesis, Department of Computer Science and Information Systems, Birkbeck College, University of London (2011)
Spicer, M.: (Ac)cumulative form in pop-rock music. Twentieth Century Music 1(1), 29–64 (2004)
Tagg, P.: From refrain to rave: the decline of figure and the rise of ground. Pop. Music 13(2), 209–222 (1994)
Tagg, P.: Everyday Tonality II (Towards a Tonal Theory of What Most People Hear). The Mass Media Music Scholars’ Press, New York and Huddersfield (2014)
Temperley, D.: What’s key for key? The Krumhansl-Schmuckler key-finding algorithm reconsidered. Music Percept. Interdiscip. J. 17(1), 65–100 (1999)
Röbel, A., Rodet, X.: Efficient spectral envelope estimation and its application to pitch shifting and envelope preservation. In: Proceedings of the 8th DAFX (2005)
Zhu, Y., Kankanhalli, M.S., Gao, S.: Music key detection for musical audio. In: Proceedings of the 11th IMMC, pp. 30–37 (2005)
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Faraldo, Á., Gómez, E., Jordà, S., Herrera, P. (2016). Key Estimation in Electronic Dance Music. In: Ferro, N., et al. (eds.) Advances in Information Retrieval. ECIR 2016. Lecture Notes in Computer Science, vol. 9626. Springer, Cham. https://doi.org/10.1007/978-3-319-30671-1_25
DOI: https://doi.org/10.1007/978-3-319-30671-1_25
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-30670-4
Online ISBN: 978-3-319-30671-1