DOI: 10.1145/3395035.3425246

It's Not What They Play, It's What You Hear: Understanding Perceived vs. Induced Emotions in Hindustani Classical Music

Published: 27 December 2020

Abstract

Music is an efficient medium for eliciting and conveying emotions. The comparison between perceived and induced emotions has been widely studied for Western music; however, this relationship has not been studied from the perspective of Hindustani classical music. In this work, we explore the relationship between perceived and induced emotions using Hindustani classical music as our stimuli. We observe that there is little to no correlation between the two; however, audio features help in distinguishing increases or decreases in induced emotion quality. We also introduce a novel dataset containing induced valence and arousal annotations for 18 Hindustani classical music songs. Furthermore, we propose a latent-space-representation-based approach that leads to a relative increase in F1 score of 32.2% for arousal classification and 34.5% for valence classification, compared to feature-based approaches for Hindustani classical music.
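The abstract contrasts a latent-space representation with feature-based approaches and reports relative F1 gains. As a minimal sketch of what such a comparison could look like (not the authors' implementation: the synthetic feature matrix, PCA as the latent encoder, and the random-forest classifier below are all stand-in assumptions), the following Python snippet trains one classifier on raw audio-style features and one on a latent embedding, then computes the relative F1 change analogous to the figures quoted above:

```python
# Hedged sketch: raw-feature baseline vs. latent-space classifier for
# binary valence/arousal labels, plus the relative F1 change.
# Synthetic data stands in for audio descriptors extracted from the
# 18 Hindustani classical music stimuli described in the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))               # placeholder feature vectors
y = (X[:, :5].sum(axis=1) > 0).astype(int)   # placeholder high/low arousal labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Feature-based baseline: classify directly on the raw descriptors.
baseline = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
f1_base = f1_score(y_te, baseline.predict(X_te))

# Latent-space variant: PCA stands in here for whatever latent encoder
# the paper actually uses; only the shape of the pipeline is illustrated.
pca = PCA(n_components=8).fit(X_tr)
latent = RandomForestClassifier(random_state=0).fit(pca.transform(X_tr), y_tr)
f1_latent = f1_score(y_te, latent.predict(pca.transform(X_te)))

# "Relative increase in F1" in the sense of the abstract (e.g., 32.2% for arousal).
rel_gain = (f1_latent - f1_base) / f1_base * 100
print(f"baseline F1={f1_base:.3f}, latent F1={f1_latent:.3f}, relative change={rel_gain:+.1f}%")
```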



Published In

ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction
October 2020
548 pages
ISBN: 9781450380027
DOI: 10.1145/3395035


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. arousal
  2. Hindustani classical music
  3. induced emotions
  4. perceived emotions
  5. valence

Qualifiers

  • Short-paper

Conference

ICMI '20
Sponsor: ICMI '20: International Conference on Multimodal Interaction
October 25-29, 2020
Virtual Event, Netherlands

Acceptance Rates

Overall Acceptance Rate 453 of 1,080 submissions, 42%

