DOI: 10.1145/3335082.3335103

Toward emotional recognition during HCI using marker-based automated video tracking

Published: 10 September 2019

Abstract

Postural movement of a seated person, as determined by lateral-aspect video analysis, can be used to estimate learning-relevant emotions. In this article, the motion of a person interacting with a computer is extracted automatically from video by detecting the positions of motion-tracking markers on the person's body. Candidate marker areas are first detected with a Convolutional Neural Network, and the correct candidates are then selected by template matching. Several markers are detected in more than 99 % of the video frames, while one is detected in only ≈ 80.2 % of the frames. Template matching identifies the correct template in ≈ 80 % of the frames, which means that whenever the correct candidates are extracted, the template matching almost always succeeds. Suggestions for improving this performance are given, along with possible uses of the marker positions for estimating sagittal-plane motion.
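The abstract's two-stage pipeline (CNN candidate detection followed by template matching) is not specified in detail on this page, but the candidate-selection step can be sketched as normalized cross-correlation scoring over candidate patches. This is a minimal illustration only, assuming grayscale patches as NumPy arrays; the function names `ncc` and `pick_marker` are hypothetical and not taken from the paper.

```python
import numpy as np

def ncc(patch: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between one candidate patch and the marker template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def pick_marker(candidates: list, template: np.ndarray):
    """Score every candidate patch against the template; return the best index and all scores."""
    scores = [ncc(c, template) for c in candidates]
    return int(np.argmax(scores)), scores
```

In such a scheme, the CNN proposes a handful of same-sized patches per frame, and the patch with the highest correlation to the marker template is kept as the marker position for that frame.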


Cited By

  • (2023) Development and Validation of an iPad-based Serious Game for Emotion Recognition and Attention Tracking towards Early Identification of Autism. 2023 11th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), 1–8. DOI: 10.1109/ACIIW59127.2023.10388145. Online publication date: 10 September 2023.


Published In

ECCE '19: Proceedings of the 31st European Conference on Cognitive Ergonomics
September 2019
231 pages
ISBN:9781450371667
DOI:10.1145/3335082

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Engagement
  2. Fast R-CNN
  3. Motion Tracking
  4. NIMI
  5. Non-Instrumental Movement Inhibition
  6. Template matching
  7. Video analysis
  8. Viterbi algorithm

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • The Brighton and Sussex Medical School Independent Research Project programme

Conference

ECCE 2019
ECCE 2019: 31st European Conference on Cognitive Ergonomics
September 10–13, 2019
Belfast, United Kingdom

Acceptance Rates

Overall Acceptance Rate 56 of 91 submissions, 62%
