Abstract
Gait Emotion Recognition is an emerging research domain that focuses on the automatic detection of emotions from a person's manner of walking. Deep-learning methodologies have proven highly effective for computer vision tasks. This paper presents a deep-learning architecture for emotion recognition from gait that fuses domain-specific discriminative features with latent deep features. The proposed Bi-Modal Deep Neural Network (BMDNN) combines salient features extracted by a deep neural network with highly discriminative handcrafted features. The proposed architecture outperforms state-of-the-art methods in all emotion classes on the Edinburgh Locomotion MoCap Dataset.
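To make the fusion idea concrete, the sketch below shows one common way to combine a learned sequence representation with precomputed handcrafted descriptors in PyTorch. It is an illustrative assumption only, not the paper's BMDNN implementation: the class name (BiModalFusionNet), all layer sizes, the choice of an LSTM for the deep branch, and the input dimensions are hypothetical.

```python
# Hypothetical bi-modal fusion sketch (not the authors' code): a deep branch
# learns latent features from raw gait sequences, a second branch ingests
# precomputed handcrafted gait descriptors, and the two representations are
# concatenated before a linear classifier over the emotion classes.
import torch
import torch.nn as nn

class BiModalFusionNet(nn.Module):
    def __init__(self, seq_feat_dim, handcrafted_dim, num_classes=4):
        super().__init__()
        # Deep branch: LSTM over per-frame pose features (sizes assumed).
        self.lstm = nn.LSTM(seq_feat_dim, 128, batch_first=True)
        # Handcrafted branch: small MLP over per-clip gait descriptors.
        self.mlp = nn.Sequential(nn.Linear(handcrafted_dim, 64), nn.ReLU())
        # Fusion head: concatenate both modalities, then classify.
        self.head = nn.Linear(128 + 64, num_classes)

    def forward(self, seq, handcrafted):
        _, (h, _) = self.lstm(seq)                            # latent deep features
        z = torch.cat([h[-1], self.mlp(handcrafted)], dim=1)  # feature-level fusion
        return self.head(z)                                   # class logits

# Usage example: a batch of 8 gait clips, 60 frames each, a 48-dim pose
# vector per frame, plus a 30-dim handcrafted descriptor per clip
# (all sizes illustrative; four emotion classes assumed).
model = BiModalFusionNet(seq_feat_dim=48, handcrafted_dim=30)
logits = model(torch.randn(8, 60, 48), torch.randn(8, 30))
```

Concatenation before the classifier is the simplest fusion strategy; the point of the sketch is only that the two modalities are combined at the feature level rather than by averaging separate predictions.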
Acknowledgment
The authors acknowledge funding from the Natural Sciences and Engineering Research Council (NSERC) Discovery Grant program, as well as partial funding of this project from the NSERC Strategic Partnership Grant (SPG) and the Innovation for Defense Excellence and Security Network (IDEaS).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Bhatia, Y., Bari, A.S.M.H., Gavrilova, M. (2022). Gait Emotion Recognition Using a Bi-modal Deep Neural Network. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2022. Lecture Notes in Computer Science, vol 13598. Springer, Cham. https://doi.org/10.1007/978-3-031-20713-6_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-20712-9
Online ISBN: 978-3-031-20713-6