DOI: 10.1007/978-3-642-41248-6_13

Music Emotion Recognition: From Content- to Context-Based Models

Published: 19 June 2012

Abstract

The striking ability of music to elicit emotions assures its prominent status in human culture and everyday life. Music is often enjoyed and sought out for its ability to induce or convey emotions, which may manifest in anything from a slight variation in mood to changes in our physical condition and actions. Consequently, research on how we associate musical pieces with emotions and, more generally, how music brings about an emotional response is attracting ever-increasing attention. First, this paper provides a thorough review of studies on the relation between music and emotions from different disciplines. We then propose new insights to enhance automated music emotion recognition models using recent results from psychology, musicology, affective computing, semantic technologies and music information retrieval.



Published In

CMMR 2012: Revised Selected Papers of the 9th International Symposium on From Sounds to Music and Emotions - Volume 7900
June 2012
499 pages
ISBN: 9783642412479

Publisher

Springer-Verlag

Berlin, Heidelberg

Author Tags

  1. appraisal
  2. arousal
  3. metadata
  4. model
  5. mood
  6. multi-modal
  7. music emotion
  8. ontology
  9. recognition
  10. retrieval
  11. review
  12. state of the art
  13. valence

Qualifiers

  • Article


Cited By

  • (2023) Affective gaming using adaptive speed controlled by biofeedback. Companion Publication of the 25th International Conference on Multimodal Interaction, 10.1145/3610661.3616124, pp. 238-246. Online publication date: 9-Oct-2023
  • (2022) Design of Emotion-Driven Game Interaction Using Biosignals. HCI in Games, 10.1007/978-3-031-05637-6_10, pp. 160-179. Online publication date: 26-Jun-2022
  • (2021) Emotion Annotation of Music: A Citizen Science Approach. Collaboration Technologies and Social Computing, 10.1007/978-3-030-85071-5_4, pp. 51-66. Online publication date: 31-Aug-2021
  • (2018) Towards a Semantic Architecture for the Internet of Musical Things. Proceedings of the 23rd Conference of Open Innovations Association FRUCT, 10.5555/3299905.3299957, pp. 382-390. Online publication date: 19-Nov-2018
  • (2018) Artificial Empathic Memory. Proceedings of the 2018 Workshop on Understanding Subjective Attributes of Data, with the Focus on Evoked Emotions, 10.1145/3267799.3267801, pp. 1-8. Online publication date: 15-Oct-2018
  • (2018) The Perceived Emotion of Isolated Synthetic Audio. Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion, 10.1145/3243274.3243277, pp. 1-8. Online publication date: 12-Sep-2018
  • (2018) Review of data features-based music emotion recognition methods. Multimedia Systems, 10.1007/s00530-017-0559-4, 24(4):365-389. Online publication date: 1-Jul-2018
  • (2017) The mood of Chinese Pop music. Journal of the Association for Information Science and Technology, 10.1002/asi.23813, 68(8):1899-1910. Online publication date: 1-Aug-2017
  • (2017) A framework for evaluating multimodal music mood classification. Journal of the Association for Information Science and Technology, 10.1002/asi.23649, 68(2):273-285. Online publication date: 1-Feb-2017
  • (2016) Music subject classification based on lyrics and user interpretations. Proceedings of the 79th ASIS&T Annual Meeting: Creating Knowledge, Enhancing Lives through Information & Technology, 10.5555/3017447.3017488, pp. 1-10. Online publication date: 14-Oct-2016
