DOI: 10.1145/3139513.3139521

An unobtrusive and multimodal approach for behavioral engagement detection of students

Published: 13 November 2017

Abstract

In this paper, we investigate the detection of students’ behavioral engagement states (On-Task vs. Off-Task) in authentic classroom settings. We propose a multimodal detection approach based on three unobtrusive modalities readily available in a 1:1 learning scenario where learning technologies are incorporated: (1) Appearance: upper-body video captured by a camera; (2) Context-Performance: students’ interaction and performance data related to the learning content; and (3) Mouse: data related to mouse movements during the learning process. For each modality, a separate unimodal classifier was trained, and decision-level fusion was applied to obtain the final behavioral engagement states. We also analyzed each modality separately for Instructional and Assessment sections (i.e., Instructional, where a student is reading an article or watching an instructional video, vs. Assessment, where a student is solving exercises on the digital learning platform). We carried out various experiments on a dataset collected in an authentic classroom, where students used camera-equipped laptops to consume Math learning content on a digital learning platform. The dataset included multimodal data from 17 students who attended a Math course for 13 sessions (40 minutes each). The results indicate that it is beneficial to have separate classification pipelines for Instructional and Assessment sections: for Instructional, using only the Appearance modality yields an F1-measure of 0.74, compared to a fused performance of 0.70; for Assessment, fusing all three modalities (F1-measure of 0.89) provides a prominent improvement over the best-performing single modality (0.81 for Appearance).
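
The two-pipeline design described above (a separate unimodal classifier per modality, decision-level fusion of their outputs, and distinct handling of Instructional vs. Assessment sections) can be summarized in a short sketch. The code below is illustrative only, not the authors’ published implementation: the Random Forest classifier, the probability-averaging fusion rule, and all names (SectionPipeline, the feature-dictionary keys) are assumptions made for the example.

```python
# Minimal sketch of the abstract's two-pipeline, decision-level fusion scheme.
# The exact fusion rule is not given in the abstract, so simple averaging of
# each unimodal classifier's P(On-Task) is assumed here.
from dataclasses import dataclass
from typing import Dict, Tuple

import numpy as np
from sklearn.ensemble import RandomForestClassifier


@dataclass
class SectionPipeline:
    """One pipeline per section type (Instructional or Assessment)."""
    modalities: Tuple[str, ...]                 # modalities fused by this pipeline
    models: Dict[str, RandomForestClassifier]

    def fit(self, features: Dict[str, np.ndarray], labels: np.ndarray) -> None:
        # Train a separate unimodal classifier per modality (the classifier
        # choice is an assumption; class_weight="balanced" stands in for
        # whatever class-imbalance handling the authors actually used).
        for m in self.modalities:
            self.models[m] = RandomForestClassifier(
                n_estimators=200, class_weight="balanced", random_state=0
            ).fit(features[m], labels)

    def predict(self, features: Dict[str, np.ndarray]) -> np.ndarray:
        # Decision-level fusion: average the unimodal P(On-Task) scores,
        # then threshold to get the final On-Task (1) / Off-Task (0) label.
        p_on_task = np.mean(
            [self.models[m].predict_proba(features[m])[:, 1]
             for m in self.modalities],
            axis=0,
        )
        return (p_on_task >= 0.5).astype(int)


# Mirroring the reported results: Appearance alone works best for
# Instructional sections, while Assessment benefits from fusing all three.
pipelines = {
    "instructional": SectionPipeline(("appearance",), {}),
    "assessment": SectionPipeline(
        ("appearance", "context_performance", "mouse"), {}
    ),
}
```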




      Published In

      MIE 2017: Proceedings of the 1st ACM SIGCHI International Workshop on Multimodal Interaction for Education
      November 2017, 75 pages
      Publisher

      Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Adaptive Learning
      2. Behavioral Engagement
      3. Intelligent Tutoring Systems
      4. Multimodal Fusion

      Qualifiers

      • Research-article

      Conference

      ICMI '17

      Cited By

      • (2024) Revisiting Annotations in Online Student Engagement. Proceedings of the 2024 10th International Conference on Computing and Data Engineering, 111-117. DOI: 10.1145/3641181.3641186. Online publication date: 15-Jan-2024.
      • (2024) TCA-NET: Triplet Concatenated-Attentional Network for Multimodal Engagement Estimation. 2024 IEEE International Conference on Image Processing (ICIP), 2062-2068. DOI: 10.1109/ICIP51287.2024.10647692. Online publication date: 27-Oct-2024.
      • (2024) Toward an Interactive Reading Experience: Deep Learning Insights and Visual Narratives of Engagement and Emotion. IEEE Access, 12, 6001-6016. DOI: 10.1109/ACCESS.2024.3350745. Online publication date: 2024.
      • (2022) The Evidence of Impact and Ethical Considerations of Multimodal Learning Analytics: A Systematic Literature Review. The Multimodal Learning Analytics Handbook, 289-325. DOI: 10.1007/978-3-031-08076-0_12. Online publication date: 9-Oct-2022.
      • (2021) Predicting Student Engagement in the Online Learning Environment. International Journal of Web-Based Learning and Teaching Technologies, 16(6), 1-21. DOI: 10.4018/IJWLTT.287095. Online publication date: 25-Oct-2021.
      • (2021) MuMIA: Multimodal Interactions to Better Understand Art Contexts. Applied Sciences, 11(6), 2695. DOI: 10.3390/app11062695. Online publication date: 17-Mar-2021.
      • (2021) Characterizing Student Engagement Moods for Dropout Prediction in Question Pool Websites. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1), 1-22. DOI: 10.1145/3449086. Online publication date: 22-Apr-2021.
      • (2021) Characterizing Academic Help-seeking Moods for Enrollment Performance of Institutional Online Student. Procedia Computer Science, 192, 3885-3894. DOI: 10.1016/j.procs.2021.09.163. Online publication date: 2021.
      • (2021) Design Considerations for a Multiple Smart-Device-Based Personalized Learning Environment. Advances in Intelligent Systems Research and Innovation, 173-184. DOI: 10.1007/978-3-030-78124-8_8. Online publication date: 3-Nov-2021.
      • (2020) Multimodal Data Fusion in Learning Analytics: A Systematic Review. Sensors, 20(23), 6856. DOI: 10.3390/s20236856. Online publication date: 30-Nov-2020.
