
FAMOS: a framework for investigating the use of face features to identify spontaneous emotions

Published: 25 May 2019

Abstract

Emotion-based analysis has attracted considerable interest, particularly in areas such as forensics, medicine, music, psychology, and human-machine interfaces. Following this trend, facial analysis (either automatic or human-based) is the most commonly investigated approach, since this type of data can be collected easily and is well accepted in the literature as a metric for inferring emotional states. Despite this popularity, several constraints found in real-world scenarios (e.g. lighting, complex backgrounds, facial hair, and so on) make accurately extracting affective information from the face a very challenging task. This work presents a framework for analysing emotional experiences through spontaneous facial expressions. The method consists of a new four-dimensional model, called FAMOS, which describes emotional experiences in terms of appraisal, facial expressions, mood, and subjective experiences, using a semi-automatic facial expression analyser as ground truth for describing the facial actions. In addition, we present an experiment using a new protocol proposed to elicit spontaneous emotional reactions. The results suggest that the initial emotional state described by the participants differed from the state they described after exposure to the eliciting stimulus, showing that the stimuli used were capable of inducing the expected emotional states in most individuals. Moreover, our results indicate that spontaneous facial reactions to emotions differ considerably from prototypic expressions, especially in terms of expressiveness.
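To make the four-dimensional description concrete, the sketch below models one FAMOS-style record per participant and compares the self-reported state before and after the eliciting stimulus, as in the experiment described above. All field names, labels, and the AU codes used here are illustrative assumptions, not the authors' actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class EmotionalExperience:
    """One observation in a four-dimensional model like FAMOS.

    Field names and value scales are hypothetical placeholders for the
    paper's four dimensions: appraisal, facial expressions (described as
    FACS action units), mood, and subjective experience.
    """
    appraisal: str                                       # e.g. "pleasant", "unpleasant"
    facial_actions: list = field(default_factory=list)   # FACS AU codes, e.g. ["AU6", "AU12"]
    mood: str = "neutral"                                # self-reported mood label
    subjective_experience: str = ""                      # free-text self-report

def mood_shifted(before: EmotionalExperience, after: EmotionalExperience) -> bool:
    """True if the self-reported mood changed after the eliciting stimulus."""
    return before.mood != after.mood

# Pre-stimulus baseline vs. post-stimulus state for one participant.
before = EmotionalExperience(appraisal="neutral")
after = EmotionalExperience(appraisal="pleasant",
                            facial_actions=["AU6", "AU12"],  # cheek raiser + lip-corner puller
                            mood="happy",
                            subjective_experience="felt cheerful")

print(mood_shifted(before, after))  # True: the stimulus induced a mood change
```

Aggregating `mood_shifted` over all participants gives the kind of before/after comparison the abstract reports, where most individuals ended in the emotional state the stimulus was designed to induce.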


Published In

Pattern Analysis & Applications, Volume 22, Issue 2
May 2019
464 pages
ISSN: 1433-7541
EISSN: 1433-755X

Publisher

Springer-Verlag

Berlin, Heidelberg

Author Tags

  1. Action units
  2. Emotion analysis
  3. Emotion models
  4. Face biometrics
  5. Facial expression analysis
