
Enhancing frame-level student engagement classification through knowledge transfer techniques

Published: 02 February 2024

Abstract

Assessing student engagement in educational settings is critical for monitoring and improving the learning process. Traditional methods for video-based student engagement classification assign a single engagement label to an entire video, even though engagement fluctuates over time with changes in concentration and interest, which makes such coarse labels inaccurate. To overcome this limitation, this paper introduces a frame-level student engagement detection approach. By analyzing each frame individually, instructors gain more detailed insight into students’ understanding of the course; identifying engagement at this granular level lets them pinpoint specific moments of disengagement or high engagement for targeted interventions. The main obstacle to frame-level classification is the lack of labeled frame-level engagement data. To address this challenge, we propose a novel approach for frame-level student engagement classification that leverages knowledge transfer. Our method pretrains a deep learning model on a labeled image-based student engagement dataset, WACV, which serves as the base dataset for identifying frame-level engagement in our target video-based DAiSEE dataset. We then fine-tune the model on the unlabeled video dataset, using the transferred knowledge to enhance engagement classification performance. Experimental results demonstrate the effectiveness of our frame-level approach, providing insights that help instructors optimize instructional strategies and enhance the learning experience. This research contributes to the advancement of student engagement assessment, offering educators a more nuanced understanding of student behavior during instructional videos.
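
The abstract describes a two-stage knowledge-transfer pipeline: pretrain on the labeled image-based WACV engagement dataset, then fine-tune on frames drawn from the unlabeled DAiSEE videos. The sketch below illustrates one way such a pipeline could look; the framework (PyTorch), the ResNet-18 backbone, the four-class label set, and the pseudo-labeling used for the unlabeled frames are assumptions for illustration only and are not taken from the paper.

# Minimal sketch of the pretrain-then-fine-tune transfer described in the
# abstract. Everything concrete here is an assumption: PyTorch, a ResNet-18
# backbone, four engagement classes, and self-training with pseudo-labels as
# one plausible way to fine-tune on the unlabeled DAiSEE frames.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models

NUM_CLASSES = 4  # assumed engagement levels, e.g. very low / low / high / very high


def build_model() -> nn.Module:
    # Backbone choice is an assumption; the abstract only says "deep learning model".
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
    return model


def train_epoch(model, loader, optimizer, device):
    # One supervised pass over a loader that yields (image, label) pairs.
    criterion = nn.CrossEntropyLoss()
    model.train()
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()


@torch.no_grad()
def pseudo_label(model, frames, device):
    # Use the image-pretrained model to assign provisional labels to unlabeled
    # video frames; the abstract does not spell out this step.
    model.eval()
    return model(frames.to(device)).argmax(dim=1)


def transfer_pipeline(wacv_loader: DataLoader, daisee_frame_loader: DataLoader):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = build_model().to(device)

    # Stage 1: pretrain on the labeled image-based engagement dataset (WACV).
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    for _ in range(10):  # epoch counts are illustrative
        train_epoch(model, wacv_loader, optimizer, device)

    # Stage 2: fine-tune on frames extracted from the target DAiSEE videos,
    # which carry no frame-level labels, using the model's own predictions
    # as training targets (self-training; an assumption, not the paper's text).
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
    for _ in range(5):
        for frames in daisee_frame_loader:  # loader yields frame tensors only
            targets = pseudo_label(model, frames, device)
            model.train()
            optimizer.zero_grad()
            loss = criterion(model(frames.to(device)), targets)
            loss.backward()
            optimizer.step()
    return model

Common variations on this second stage include freezing most of the backbone, weighting pseudo-labels by prediction confidence, or propagating the video-level DAiSEE labels down to individual frames instead of self-training.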



        Published In

        Applied Intelligence, Volume 54, Issue 2
        Jan 2024
        1129 pages

        Publisher

        Kluwer Academic Publishers

        United States

        Publication History

        Published: 02 February 2024
        Accepted: 26 December 2023

        Author Tags

        1. Student engagement classification
        2. Knowledge transfer
        3. Online education
        4. Facial feature
        5. Action units

        Qualifiers

        • Research-article

        Article Metrics

        • Total Citations: 0
        • Total Downloads: 0
        • Downloads (Last 12 months): 0
        • Downloads (Last 6 weeks): 0
        Reflects downloads up to 15 Jan 2025
