DOI: 10.1145/3284179.3284230

A comparison of students' emotional self-reports with automated facial emotion recognition in a reading situation

Published: 24 October 2018

Abstract

This study investigated the measurement of students' emotional states during a common learning activity, digital reading of factual texts. The objective was to compare emotional self-reports with automated facial emotion recognition. The latter promises non-intrusive measurements of emotions, which could inform adaptive learning systems. We used an established facial emotion recognition software trained on experts' ratings of facial expressions (FaceReader). For basic emotions, previous studies have reported high agreement of the software with human raters. However, little evidence exists a) on its performance for the epistemic emotions of interest and boredom, b) on its agreement with self-reports, and c) in naturalistic reading situations. We compared the facial expression-based recognition of interest, boredom, and valence of affect to students' self-reports of those emotional states. Analyses of webcam recordings of 103 students revealed no relationship between facial emotion recognition and self-reports. Due to the low agreement of the facial emotion recognition software with self-reports, it remains unclear what the facial expression-based recognition of interest, boredom, and valence actually implies. We advise waiting for more comprehensive evidence a) on the agreement of facial emotion recognition software with self-reports or b) on its predictive validity for learning before applying it in educational practice (e.g., in adaptive learning systems).
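
To make the comparison concrete, the following is a minimal, purely illustrative sketch (in Python) of how automated facial-emotion scores could be set against self-reports: per-frame estimates for one emotion are averaged over a reading window and then correlated with questionnaire ratings across students. The data, variable names, and the choice of mean aggregation and Pearson correlation are hypothetical assumptions for illustration only; they are not the paper's actual FaceReader export format or analysis.

```python
# Illustrative sketch only -- NOT the authors' analysis pipeline.
# Assumes hypothetical per-frame facial-emotion scores (as a FaceReader-like
# tool might export) and per-text self-report ratings, then computes a simple
# across-student agreement (Pearson correlation) for one emotion.

from statistics import mean
from math import sqrt

# Hypothetical data layout: one record per student for the same text.
# "frames" holds per-frame automated scores in [0, 1]; "self_report" holds
# the student's questionnaire rating of the same emotion for that text.
records = [
    {"student": 1, "emotion": "interest",
     "frames": [0.20, 0.35, 0.30], "self_report": 4.0},
    {"student": 2, "emotion": "interest",
     "frames": [0.60, 0.55, 0.70], "self_report": 2.0},
    {"student": 3, "emotion": "interest",
     "frames": [0.10, 0.15, 0.12], "self_report": 5.0},
]

def pearson(x, y):
    """Pearson correlation between two equally long lists of scores."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

# Aggregate the frame-level scores over the reading window, then correlate
# the aggregates with the self-reports across students.
auto_scores = [mean(r["frames"]) for r in records]
self_reports = [r["self_report"] for r in records]
print("agreement (Pearson r):", round(pearson(auto_scores, self_reports), 3))
```

In such a check, a correlation near zero would correspond to the kind of non-relationship between the two measures that the abstract reports, whereas a substantial positive correlation would indicate agreement.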


Cited By

  • (2023) The first steps for adapting an artificial intelligence emotion expression recognition software for emotional management in the educational context. British Journal of Educational Technology, 54(6), 1939-1963. DOI: 10.1111/bjet.13326. Online publication date: 25-Apr-2023.
  • (2022) Secrets revealed by boredom: Detecting and tackling barriers to student engagement. 2022 International Conference on Advanced Learning Technologies (ICALT), 417-419. DOI: 10.1109/ICALT55010.2022.00129. Online publication date: Jul-2022.


        Published In

        TEEM'18: Proceedings of the Sixth International Conference on Technological Ecosystems for Enhancing Multiculturality
        October 2018
        1072 pages
        ISBN: 9781450365185
        DOI: 10.1145/3284179

        In-Cooperation

        • University of Salamanca

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Author Tags

        1. FaceReader
        2. adaptive learning
        3. emotional self-reports
        4. epistemic emotions
        5. facial emotion recognition

        Qualifiers

        • Research-article
        • Research
        • Refereed limited

        Conference

        TEEM'18

        Acceptance Rates

        TEEM'18 Paper Acceptance Rate: 151 of 243 submissions, 62%
        Overall Acceptance Rate: 496 of 705 submissions, 70%
