DOI: 10.1145/2930238.2930371 (Public Access)

Detecting Student Engagement: Human Versus Machine

Published: 13 July 2016

Abstract

Engagement is complex and multifaceted, yet crucial to learning. Computerized learning environments can provide a better learning experience by automatically detecting student engagement (and thus also disengagement) and adapting to it. This paper summarizes results from several previous studies that used facial features to automatically detect student engagement, and proposes new methods to extend and improve those results. Videos of students will be annotated by third-party observers as mind wandering (disengaged) or not mind wandering (engaged). Automatic detectors will also be trained to classify the same videos from students' facial features, and their predictions will be compared to the human annotations. The detectors will then be improved by engineering features that capture the facial expressions noted by observers, and by more heavily weighting training instances that observers classified exceptionally well. Finally, implications of the previous results and the proposed work are discussed.
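The instance-weighting idea in the abstract (counting clips that observers classified exceptionally well more heavily during training) can be sketched with a weighted logistic-regression learner. This is only an illustration under assumed inputs: the two "facial features", the observer-agreement counts, and the plain logistic-regression model are stand-ins, not the detectors, features, or data from the study.

```python
import math

def train_weighted_logreg(X, y, w, lr=0.5, epochs=2000):
    """Batch-gradient logistic regression in which each training
    instance's gradient contribution is scaled by its weight."""
    d = len(X[0])
    theta = [0.0] * (d + 1)                  # [bias, coef_1, ..., coef_d]
    n = len(X)
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi, wi in zip(X, y, w):
            z = theta[0] + sum(t * v for t, v in zip(theta[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted P(mind wandering)
            err = wi * (p - yi)              # weight scales this instance's pull
            grad[0] += err
            for j, v in enumerate(xi):
                grad[j + 1] += err * v
        theta = [t - lr * g / n for t, g in zip(theta, grad)]
    return theta

def predict(theta, xi):
    z = theta[0] + sum(t * v for t, v in zip(theta[1:], xi))
    return 1 if z > 0 else 0

# Toy data: two made-up facial features per video clip;
# label 1 = mind wandering (disengaged), 0 = not mind wandering.
X = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1], [0.5, 0.6], [0.6, 0.4]]
y = [1, 1, 0, 0, 1, 0]
agreement = [3, 3, 3, 3, 2, 2]               # observers agreeing, out of 3
w = [a / 3 for a in agreement]               # unanimous clips count more

theta = train_weighted_logreg(X, y, w)
print(predict(theta, [0.9, 0.9]), predict(theta, [0.1, 0.1]))
```

Down-weighting the ambiguous clips (where only two of three observers agreed) lets the clearly annotated instances dominate the fit, which is one plausible reading of the weighting scheme the abstract proposes.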



Published In

UMAP '16: Proceedings of the 2016 Conference on User Modeling Adaptation and Personalization
July 2016, 366 pages
ISBN: 9781450343688
DOI: 10.1145/2930238
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. affective computing
  2. engagement detection
  3. facial expressions

Qualifiers

  • Abstract

Conference

UMAP '16: User Modeling, Adaptation and Personalization Conference
July 13-17, 2016
Halifax, Nova Scotia, Canada

Acceptance Rates

UMAP '16 Paper Acceptance Rate: 21 of 123 submissions, 17%
Overall Acceptance Rate: 162 of 633 submissions, 26%

Article Metrics

  • Downloads (Last 12 months)188
  • Downloads (Last 6 weeks)25
Reflects downloads up to 01 Nov 2024

Cited By

  • Addressing Class Imbalances in Video Time-Series Data for Estimation of Learner Engagement: "Over Sampling with Skipped Moving Average". Education Sciences 14(6), 556. DOI: 10.3390/educsci14060556. Online: 22 May 2024.
  • Deep Learning Based Approach For Detecting Student Engagement Through Facial Emotions. 2024 International Conference on Data Science and Network Security (ICDSNS), 1-6. DOI: 10.1109/ICDSNS62112.2024.10691098. Online: 26 July 2024.
  • Investigating features that play a role in predicting gifted student engagement using machine learning: Video log and self-report data. Education and Information Technologies 29(13), 16317-16343. DOI: 10.1007/s10639-024-12490-9. Online: 8 February 2024.
  • Bag of states: a non-sequential approach to video-based engagement measurement. Multimedia Systems 30(1). DOI: 10.1007/s00530-023-01244-1. Online: 28 January 2024.
  • Student Engagement Awareness in an Asynchronous E-Learning Environment. International Journal of Technology-Enabled Student Support Services 12(1), 1-19. DOI: 10.4018/IJTESSS.316211. Online: 6 January 2023.
  • Student-Engagement Detection in Classroom Using Machine Learning Algorithm. Electronics 12(3), 731. DOI: 10.3390/electronics12030731. Online: 1 February 2023.
  • Multimodal Engagement Analysis From Facial Videos in the Classroom. IEEE Transactions on Affective Computing 14(2), 1012-1027. DOI: 10.1109/TAFFC.2021.3127692. Online: 1 April 2023.
  • Detection and Intervention Engagement Service Development for New Normal Distance Learning. 2023 International Seminar on Intelligent Technology and Its Applications (ISITIA), 433-437. DOI: 10.1109/ISITIA59021.2023.10221195. Online: 26 July 2023.
  • Engagement Analysis of Learners Using Emotions: The SimEng System. 2023 10th International and the 16th National Conference on E-Learning and E-Teaching (ICeLeT), 1-6. DOI: 10.1109/ICeLeT58996.2023.10139905. Online: 28 February 2023.
  • Lightweight Model for Emotion Detection from Facial Expression in Online Learning. 2023 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), 174-179. DOI: 10.1109/CCECE58730.2023.10288951. Online: 24 September 2023.
