Abstract
To achieve accurate judgment and identification of group behaviors, a hierarchical deep network model is constructed for group behavior recognition. The model is compared with a support vector machine model and a convolutional network model in terms of stability, accuracy, expression movement, orientation, error, and work efficiency. The hierarchical deep network shows distinct advantages over the two baselines: its standard deviation is 0.013, whereas the standard deviations of the other two models are larger, and it also performs well in detection accuracy and error. In addition, the hierarchical deep network is used to recognize human behaviors and orientations and to extract and recover expressions in group behaviors, and it is more efficient than the other two models. Group behavior recognition with the proposed model is little affected by the regional environment and other factors, and there is no significant difference in the judgment results. Analysis of the model's operation in group behavior recognition shows that it yields comprehensive and excellent results, indicating that group behavior recognition is an overall outcome and that the parameters of multiple layers must be identified accurately. This research improves the understanding of hierarchical deep networks and group behavior recognition.
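The abstract does not specify the internal structure of the hierarchical deep network, so the sketch below is an illustrative assumption only, not the authors' published architecture: one common way to realize a two-level ("hierarchical") model for group behavior recognition is to encode each person individually and then pool the individual features into a group-level representation that a classifier maps to a group-behavior label. All layer sizes, names, and the mean-pooling choice here are hypothetical.

```python
# Minimal sketch (assumed architecture, not the paper's): a two-level
# hierarchical network for group behavior recognition.
import torch
import torch.nn as nn


class HierarchicalGroupNet(nn.Module):
    def __init__(self, feat_dim=256, num_classes=8):
        super().__init__()
        # Individual level: a small CNN applied to each person crop.
        self.person_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        # Group level: aggregate person features and classify the behavior.
        self.group_classifier = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, person_crops):
        # person_crops: (batch, persons, 3, H, W)
        b, p, c, h, w = person_crops.shape
        feats = self.person_encoder(person_crops.view(b * p, c, h, w))
        feats = feats.view(b, p, -1).mean(dim=1)  # mean-pool over persons
        return self.group_classifier(feats)


# Example: 2 scenes, 5 person crops each, 64x64 RGB patches.
logits = HierarchicalGroupNet()(torch.randn(2, 5, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 8])
```

Mean pooling is only one possible group-level aggregator; attention or recurrent aggregation over person features would fit the same two-level scheme.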
Acknowledgements
This work was supported by the Natural Science Foundation of Shandong Province (ZR2018BF005) and the Project of Art Science in Shandong Province (201806506).
Ethics declarations
Conflict of interest
The authors declare that no conflict of interest exists in the submission of this manuscript, and the manuscript has been approved by all authors for publication.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Qiao, S., Wang, L. & Gao, Z. Group behavior recognition based on deep hierarchical network. Neural Comput & Applic 32, 5389–5398 (2020). https://doi.org/10.1007/s00521-019-04699-4