Abstract
Recently, the focus of research on human affect recognition has shifted from the six basic emotions to complex affect recognition in continuous two- or three-dimensional space, driven by the following challenges: (i) the difficulty of representing and analysing a large number of emotions in one framework, (ii) the problem of representing complex emotions in such a framework, (iii) the lack of validation of the framework through measured signals, and (iv) the limited applicability of the chosen framework to other aspects of affective computing. This paper presents a Valence-Arousal-Dominance framework for representing emotions, capable of placing complex emotions in a continuous 3D space. To validate the model, an affect recognition technique is proposed that analyses spontaneous physiological (EEG) and visual cues. The DEAP dataset, a multimodal emotion dataset containing video and physiological signals along with Valence, Arousal and Dominance ratings, is used for multimodal analysis and recognition of human emotions. The results support the correctness and sufficiency of the proposed framework. The model is also compared with two-dimensional models, and its capacity to represent many more complex emotions is discussed.
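The abstract's central idea, placing each emotion at a point in a continuous valence-arousal-dominance space, can be sketched as a nearest-prototype lookup: a measured (valence, arousal, dominance) rating is assigned the label of the closest emotion prototype. The prototype coordinates and the `classify_vad` helper below are illustrative assumptions for demonstration, not the paper's actual method or values:

```python
import math

# Illustrative emotion prototypes as (valence, arousal, dominance) points,
# each on a 1-9 scale as in DEAP's self-assessment ratings.
# These coordinates are assumed for demonstration, not taken from the paper.
EMOTION_PROTOTYPES = {
    "happy":   (8.0, 6.5, 7.0),
    "angry":   (2.0, 8.0, 7.5),
    "sad":     (2.5, 3.0, 3.0),
    "relaxed": (7.5, 2.5, 6.0),
    "afraid":  (2.0, 7.5, 2.5),
}

def classify_vad(point, prototypes=EMOTION_PROTOTYPES):
    """Return the label of the prototype nearest (Euclidean) to `point`."""
    return min(prototypes, key=lambda label: math.dist(point, prototypes[label]))

print(classify_vad((7.8, 6.0, 6.8)))   # high valence, moderate arousal, high dominance
print(classify_vad((2.1, 7.6, 2.4)))   # low valence, high arousal, low dominance
```

Note how the third axis (dominance) is what separates "angry" from "afraid" here: both are low-valence and high-arousal, and only the dominance coordinate distinguishes them, which is the kind of complex-emotion discrimination a 2D valence-arousal model cannot make.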
Cite this article
Verma, G.K., Tiwary, U.S. Affect representation and recognition in 3D continuous valence–arousal–dominance space. Multimed Tools Appl 76, 2159–2183 (2017). https://doi.org/10.1007/s11042-015-3119-y