DOI: 10.1145/2678025.2701397

Automatic Detection of Learning-Centered Affective States in the Wild

Published: 18 March 2015

Abstract

Affect detection is a key component in developing intelligent educational interfaces that are capable of responding to the affective needs of students. In this paper, computer vision and machine learning techniques were used to detect students' affect as they used an educational game designed to teach fundamental principles of Newtonian physics. Data were collected in the real-world environment of a school computer lab, which provides unique challenges for detection of affect from facial expressions (primary channel) and gross body movements (secondary channel): up to thirty students at a time participated in the class, moving around, gesturing, and talking to each other. Results were cross-validated at the student level to ensure generalization to new students. Classification was successful at levels above chance for off-task behavior (area under the receiver operating characteristic curve, AUC = .816) and for each affective state, including boredom (AUC = .610), confusion (.649), delight (.867), engagement (.679), and frustration (.631), as well as for a five-way overall classification of affect (.655), despite the noisy nature of the data. Implications and prospects for affect-sensitive interfaces for educational software in classroom environments are discussed.
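The student-level cross-validation described in the abstract (all observations from a held-out student are kept out of training, so the model is evaluated on entirely unseen students) can be sketched with scikit-learn's GroupKFold. This is a minimal illustration with synthetic stand-in features and labels, not the paper's actual pipeline or classifier:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))          # stand-in features (e.g. facial/body descriptors)
y = rng.integers(0, 2, size=300)       # stand-in binary label (e.g. off-task vs. on-task)
groups = np.repeat(np.arange(30), 10)  # 30 hypothetical students, 10 observations each

# GroupKFold guarantees no student's data appears in both train and test folds,
# mirroring the student-level cross-validation used in the paper.
aucs = []
for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores = clf.predict_proba(X[test_idx])[:, 1]
    aucs.append(roc_auc_score(y[test_idx], scores))

mean_auc = float(np.mean(aucs))  # ~0.5 here, since the labels are random noise
```

With random labels the mean AUC hovers near chance (.5); the paper's reported AUCs (.610-.867) reflect genuine signal under this same evaluation scheme.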



    Published In

    IUI '15: Proceedings of the 20th International Conference on Intelligent User Interfaces
    March 2015
    480 pages
    ISBN:9781450333061
    DOI:10.1145/2678025

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. affect detection
    2. classroom data
    3. in the wild
    4. naturalistic facial expressions

    Qualifiers

    • Research-article

    Funding Sources

    • NSF
    • Bill & Melinda Gates Foundation

    Conference

    IUI '15

    Acceptance Rates

    IUI '15 Paper Acceptance Rate: 47 of 205 submissions, 23%
    Overall Acceptance Rate: 746 of 2,811 submissions, 27%


    Article Metrics

    • Downloads (Last 12 months)88
    • Downloads (Last 6 weeks)13
    Reflects downloads up to 28 Dec 2024

    Cited By
    • (2024) Revisiting Annotations in Online Student Engagement. Proceedings of the 2024 10th International Conference on Computing and Data Engineering, pp. 111-117. DOI: 10.1145/3641181.3641186. Online publication date: 15-Jan-2024.
    • (2024) Predicting Student Engagement Using Sequential Ensemble Model. IEEE Transactions on Learning Technologies 17, pp. 939-950. DOI: 10.1109/TLT.2023.3342860. Online publication date: 1-Jan-2024.
    • (2024) Facial Emotion Recognition and Detection. 2024 OPJU International Technology Conference (OTCON) on Smart Computing for Innovation and Advancement in Industry 4.0, pp. 1-5. DOI: 10.1109/OTCON60325.2024.10687569. Online publication date: 5-Jun-2024.
    • (2024) Measuring student behavioral engagement using histogram of actions. Pattern Recognition Letters 186, pp. 337-344. DOI: 10.1016/j.patrec.2024.11.002. Online publication date: Oct-2024.
    • (2024) Leveraging part-and-sensitive attention network and transformer for learner engagement detection. Alexandria Engineering Journal 107, pp. 198-204. DOI: 10.1016/j.aej.2024.06.074. Online publication date: Dec-2024.
    • (2024) Improving collaborative problem-solving skills via automated feedback and scaffolding: a quasi-experimental study with CPSCoach 2.0. User Modeling and User-Adapted Interaction 34:4, pp. 1087-1125. DOI: 10.1007/s11257-023-09387-6. Online publication date: 14-Feb-2024.
    • (2024) Class-attention video transformer for engagement prediction. Multimedia Tools and Applications. DOI: 10.1007/s11042-024-20350-4. Online publication date: 12-Oct-2024.
    • (2024) Artificial intelligence based cognitive state prediction in an e-learning environment using multimodal data. Multimedia Tools and Applications. DOI: 10.1007/s11042-023-18021-x. Online publication date: 16-Jan-2024.
    • (2024) A New Approach for Counting and Identification of Students Sentiments in Online Virtual Environments Using Convolutional Neural Networks. Advances in Computational Intelligence. MICAI 2023 International Workshops, pp. 29-40. DOI: 10.1007/978-3-031-51940-6_4. Online publication date: 20-Jan-2024.
    • (2023) A Survey on Facial Emotion Identification using Deep Learning Models. Advances in Computational Intelligence in Materials Science, pp. 12-16. DOI: 10.53759/acims/978-9914-9946-9-8_3. Online publication date: 7-Jun-2023.
