Emotion Expression in Human Body Posture and Movement: A Survey on Intelligible Motion Factors, Quantification and Validation

Published: 01 October 2023

Abstract

Many areas of computer science need to analyze, quantify, and reproduce movements that express emotions. This paper presents a systematic review of the intelligible factors involved in the expression of emotion in human movement and posture. We gather works that have studied and sought to identify these factors across many disciplinary fields, including psychology, biomechanics, choreography, robotics, and computer vision. Because each of these studies uses its own definitions, units, and emotion sets, no global and coherent view has emerged. We propose a meta-analysis approach that cross-references and aggregates these studies to obtain a unified list of expressive factors, quantified for each emotion. We then propose a calculation method for each expressive factor and extract the factors from an emotionally annotated animation dataset, Emilya. Comparing the results of the meta-analysis with those of the Emilya analysis reveals high correlation rates, which validates the relevance of the quantified values obtained by both methodologies. The analysis of these results raises interesting perspectives for future research in affective computing.
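To make the validation step concrete, here is a minimal sketch (in Python, with purely illustrative factor names and numbers; the paper's actual factors, units, and values are given in the full text) of how expressive-factor values aggregated from the literature could be correlated, per emotion, with values extracted from an annotated motion dataset such as Emilya:

    # Minimal sketch: per-emotion correlation between factor values
    # aggregated from the literature (meta-analysis) and values
    # extracted from an annotated motion dataset.
    # All data below is invented for illustration only.
    from scipy.stats import pearsonr

    # Hypothetical expressive factors, normalized to [0, 1]:
    # e.g. velocity, acceleration, arm openness, head inclination, jerk.
    meta_analysis = {
        "anger":   [0.90, 0.85, 0.70, 0.30, 0.80],
        "sadness": [0.15, 0.10, 0.20, 0.85, 0.10],
        "joy":     [0.80, 0.70, 0.90, 0.10, 0.60],
    }
    emilya_extracted = {
        "anger":   [0.85, 0.80, 0.65, 0.35, 0.75],
        "sadness": [0.20, 0.15, 0.25, 0.80, 0.15],
        "joy":     [0.75, 0.65, 0.85, 0.20, 0.55],
    }

    # High r for an emotion means the two quantification routes agree
    # on that emotion's expressive-factor profile.
    for emotion, meta_values in meta_analysis.items():
        r, p = pearsonr(meta_values, emilya_extracted[emotion])
        print(f"{emotion}: r = {r:.2f}, p = {p:.3f}")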


Cited By

  • (2024) "Diffusion-Based Unsupervised Pre-training for Automated Recognition of Vitality Forms," in Proc. 2024 Int. Conf. Advanced Visual Interfaces, pp. 1–9, Jun. 2024, doi: 10.1145/3656650.3656689.
  • (2024) "Towards Context-Aware Emotion Recognition Debiasing From a Causal Demystification Perspective via De-Confounded Training," IEEE Trans. Pattern Anal. Mach. Intell., vol. 46, no. 12, pp. 10663–10680, Dec. 2024, doi: 10.1109/TPAMI.2024.3443129.
  • (2024) "Emotion Recognition From Full-Body Motion Using Multiscale Spatio-Temporal Network," IEEE Trans. Affective Comput., vol. 15, no. 3, pp. 898–912, Jul. 2024, doi: 10.1109/TAFFC.2023.3305197.

Published In

IEEE Transactions on Affective Computing, Volume 14, Issue 4 (Oct.–Dec. 2023), 832 pages

Publisher

IEEE Computer Society Press, Washington, DC, United States

Publication History

Published: 01 October 2023

Qualifiers

• Research-article
