DOI: 10.1145/3242969.3243013

The Multimodal Dataset of Negative Affect and Aggression: A Validation Study

Published: 02 October 2018

Abstract

Within the affective computing and social signal processing communities, increasing efforts are being made to collect data with genuine emotional content. When it comes to negative emotions and even aggression, ethical and privacy-related issues preclude the use of many emotion elicitation methods, and most often actors are employed to act out different scenarios. Moreover, for most databases emotional arousal is not explicitly verified, and the footage is annotated by external raters on the basis of observable behavior. In an attempt to gather data a step closer to real life, previous work proposed an elicitation method for collecting the database of negative affect and aggression that involved unscripted role-plays between aggression regulation training actors (actors) and naive participants (students), in which the participants receive only short role descriptions and goals. In this paper we present a validation study for the database of negative affect and aggression, investigating whether the actors' behavior (e.g. becoming more aggressive) had a real impact on the students' emotional arousal. We found significant changes in the students' heart rate variability (HRV) parameters corresponding to changes in the actors' aggression level and emotional state, and we therefore conclude that this method is a good candidate for emotion elicitation.
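
The validation described above rests on standard time-domain HRV analysis of the students' heart data per role-play segment. The Python sketch below is a minimal illustration of that kind of check, not the authors' actual pipeline: the hrv_time_domain helper, the synthetic RR-interval data, the segment labels ("low", "medium", "high" actor aggression), and the choice of RMSSD as the test variable are all assumptions made for the example. It computes per-participant HRV parameters from RR intervals and compares them across aggression levels with a non-parametric Kruskal-Wallis test.

import numpy as np
from scipy.stats import kruskal

def hrv_time_domain(rr_ms):
    """Time-domain HRV measures from a sequence of RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    return {
        "mean_rr": rr.mean(),                   # mean inter-beat interval
        "sdnn": rr.std(ddof=1),                 # overall variability
        "rmssd": np.sqrt(np.mean(diffs ** 2)),  # short-term (vagally mediated) variability
    }

# Hypothetical per-participant RR series, grouped by the actor's annotated
# aggression level during the role-play (purely synthetic demo data).
rng = np.random.default_rng(0)
segments = {
    "low":    [rng.normal(850, 50, 120) for _ in range(10)],
    "medium": [rng.normal(800, 45, 120) for _ in range(10)],
    "high":   [rng.normal(740, 35, 120) for _ in range(10)],
}

# One RMSSD value per participant per aggression level.
rmssd_by_level = {
    level: [hrv_time_domain(rr)["rmssd"] for rr in series_list]
    for level, series_list in segments.items()
}

# Non-parametric test for differences in RMSSD across the three aggression levels.
stat, p = kruskal(*rmssd_by_level.values())
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.4f}")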

Cited By

  • (2022) A Cross-Corpus Speech-Based Analysis of Escalating Negative Interactions. Frontiers in Computer Science, 10.3389/fcomp.2022.749804, vol. 4. Online publication date: 7-Mar-2022
  • (2020) MMGatorAuth. Proceedings of the 2020 International Conference on Multimodal Interaction, 10.1145/3382507.3418881, pp. 370-377. Online publication date: 21-Oct-2020
  • (2020) Is She Truly Enjoying the Conversation? Proceedings of the 2020 International Conference on Multimodal Interaction, 10.1145/3382507.3418844, pp. 315-323. Online publication date: 21-Oct-2020

Published In

ICMI '18: Proceedings of the 20th ACM International Conference on Multimodal Interaction
October 2018
687 pages
ISBN:9781450356923
DOI:10.1145/3242969

Sponsors

  • SIGCHI: ACM Special Interest Group on Computer-Human Interaction

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. affective computing
  2. dataset validity
  3. multimodal datasets
  4. virtual reality therapy

Qualifiers

  • Research-article

Conference

ICMI '18
Sponsor:
  • SIGCHI

Acceptance Rates

ICMI '18 Paper Acceptance Rate: 63 of 149 submissions (42%)
Overall Acceptance Rate: 453 of 1,080 submissions (42%)
