Wearable and automotive systems for affect recognition from physiology
Publisher:
  • Massachusetts Institute of Technology
  • 201 Vassar Street, W59-200 Cambridge, MA
  • United States
Order Number: AAI0801928
Pages: 1
Abstract

Novel systems and algorithms have been designed and built to recognize affective patterns in physiological signals. Experiments were conducted to evaluate the new systems and algorithms in three types of settings: a highly constrained laboratory setting, a largely unconstrained ambulatory environment, and a moderately constrained automotive environment.

The laboratory experiment was designed to test for the presence of unique physiological patterns given a relatively motionless seated subject intentionally expressing one of eight different emotions. This experiment generated a large dataset of physiological signals containing many day-to-day variations, and the proposed features contributed to a success rate of 81% for discriminating all eight emotions and up to 100% for subsets of emotions grouped by similar qualities.

For experiments in a largely unconstrained ambulatory setting, new wearable computer systems and sensors were developed and tested on subjects who walked, jogged, talked, and otherwise went about daily activities. Although physical motion often overwhelmed affective signals in this context, the systems developed in this thesis are currently useful as activity monitors, providing an image diary correlated with physiological signals.

A third experiment was conducted in the natural but physically constrained environment of an automobile, generating a large database of physiological signals covering over 40 hours of driving. This experiment was designed to induce varying amounts of stress in three conditions: rest, city driving, and highway driving. Algorithms for detecting driver stress achieved recognition rates of 96% for stress ratings based on task conditions and 89% accuracy using ratings of perceived stress from driver questionnaires. Second-by-second video coding of stressors by independent coders showed highly significant correlations with physiological features (up to r = .77 for over 4000 samples).
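The abstract describes the general approach — extract statistical features from physiological signals, then classify affective or stress states — without listing the exact features or classifiers here. As a purely illustrative sketch (the feature set, the synthetic signal values, and the nearest-mean classifier below are assumptions for demonstration, not the thesis's actual method), such a pipeline might look like:

```python
import math
from statistics import mean, stdev

def features(signal):
    # Mean level, variability, and mean absolute first difference:
    # stand-ins for the kinds of statistical features commonly computed
    # on physiological signals such as skin conductance.
    diffs = [abs(b - a) for a, b in zip(signal, signal[1:])]
    return (mean(signal), stdev(signal), mean(diffs))

def nearest_mean_classify(train, signal):
    # Assign the signal to the class whose mean feature vector is
    # closest in Euclidean distance (a deliberately simple classifier,
    # used here only to illustrate the pattern-recognition step).
    query = features(signal)
    best_label, best_dist = None, float("inf")
    for label, vectors in train.items():
        centroid = [mean(col) for col in zip(*vectors)]
        dist = math.dist(centroid, query)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Synthetic skin-conductance-like segments, illustrative only.
calm = [[2.0, 2.1, 2.0, 2.05, 1.95], [1.9, 2.0, 2.1, 2.0, 1.95]]
stressed = [[5.0, 5.6, 4.8, 5.9, 5.2], [5.5, 6.0, 5.1, 5.8, 5.4]]
train = {"calm": [features(s) for s in calm],
         "stressed": [features(s) for s in stressed]}
```

A new segment resembling the high, variable traces would be labeled "stressed" by this sketch; the actual work reported above used richer feature sets and evaluated against task conditions, questionnaires, and independent video coding.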
Together, these three experiments show a range of success in recognizing affect from physiology: recognition rates were high under somewhat constrained conditions, while unconstrained conditions highlighted the need for more automatic context sensing. The recognition rates obtained thus far support the hypothesis that many emotional differences can be automatically discriminated from patterns of physiological change. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)

Cited By

  1. Chaptoukaev H, Strizhkova V, Panariello M, D'Alpaos B, Reka A, Manera V, Thümmler S, Ismailova E, Evans N, Bremond F, Todisco M, Zuluaga M and Ferrari L StressID Proceedings of the 37th International Conference on Neural Information Processing Systems, (29798-29811)
  2. Adiani D, Itzkovitz A, Bian D, Katz H, Breen M, Hunt S, Swanson A, Vogus T, Wade J and Sarkar N (2022). Career Interview Readiness in Virtual Reality (CIRVR): A Platform for Simulated Interview Training for Autistic Individuals and Their Employers, ACM Transactions on Accessible Computing, 15:1, (1-28), Online publication date: 31-Mar-2022.
  3. Sibi S, Balters S, Fu E, Strack E, Steinert M and Ju W Back to School: Impact of Training on Driver Behavior and State in Autonomous Vehicles 2020 IEEE Intelligent Vehicles Symposium (IV), (1189-1196)
  4. Mavridou I, Seiss E, Kostoulas T, Nduka C and Balaguer-Ballester E Towards an effective arousal detection system for virtual reality Proceedings of the Workshop on Human-Habitat for Health (H3): Human-Habitat Multimodal Interaction for Promoting Health and Well-Being in the Internet of Things Era, (1-6)
  5. Haouij N, Poggi J, Sevestre-Ghalila S, Ghozi R and Jaïdane M AffectiveROAD system and database to assess driver's attention Proceedings of the 33rd Annual ACM Symposium on Applied Computing, (800-803)
  6. Gupta N, Najeeb D, Gabrielian V and Nahapetian A Mobile ECG-based drowsiness detection 2017 14th IEEE Annual Consumer Communications & Networking Conference (CCNC), (29-32)
  7. Cruz L, Rubin J, Abreu R, Ahern S, Eldardiry H and Bobrow D A wearable and mobile intervention delivery system for individuals with panic disorder Proceedings of the 14th International Conference on Mobile and Ubiquitous Multimedia, (175-182)
  8. Heibeck F, Hope A and Legault J Sensory Fiction Proceedings of the 2nd ACM International Workshop on Immersive Media Experiences, (35-40)
  9. Jeon M Advanced Vehicle Sonification Applications Adjunct Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, (1-5)
  10. Ivonin L, Chang H, Chen W and Rauterberg M (2013). Unconscious emotions, Personal and Ubiquitous Computing, 17:4, (663-673), Online publication date: 1-Apr-2013.
  11. Sharma N and Gedeon T (2012). Objective measures, sensors and computational techniques for stress recognition and classification, Computer Methods and Programs in Biomedicine, 108:3, (1287-1301), Online publication date: 1-Dec-2012.
  12. Meftah I, Le Thanh N and Ben Amar C Emotion recognition using KNN classification for user modeling and sharing of affect states Proceedings of the 19th international conference on Neural Information Processing - Volume Part I, (234-242)
  13. Yazdani A, Lee J, Vesin J and Ebrahimi T (2012). Affect recognition based on physiological changes during the watching of music videos, ACM Transactions on Interactive Intelligent Systems, 2:1, (1-26), Online publication date: 1-Mar-2012.
  14. Zhang L, Tamminedi T, Ganguli A, Yosiphon G and Yadegar J Hierarchical multiple sensor fusion using structurally learned Bayesian network Wireless Health 2010, (174-183)
  15. Money A and Agius H (2010). ELVIS, ACM Transactions on Multimedia Computing, Communications, and Applications, 6:3, (1-30), Online publication date: 1-Aug-2010.
  16. Yang G, Lin Y and Bhattacharya P (2010). A driver fatigue recognition model based on information fusion and dynamic Bayesian network, Information Sciences: an International Journal, 180:10, (1942-1954), Online publication date: 1-May-2010.
  17. Taleb T, Bottazzi D and Nasser N (2010). A novel middleware solution to improve ubiquitous healthcare systems aided by affective information, IEEE Transactions on Information Technology in Biomedicine, 14:2, (335-349), Online publication date: 1-Mar-2010.
  18. Ping A and Wang Z Human-computer affective interaction system research and application based on physiological signals Proceedings of the 21st annual international conference on Chinese control and decision conference, (5897-5900)
  19. Soleymani M, Chanel G, Kierkels J and Pun T Affective ranking of movie scenes using physiological signals and content analysis Proceedings of the 2nd ACM workshop on Multimedia semantics, (32-39)
  20. Lessa J Representation and communication of affective states Proceedings of the 26th annual ACM international conference on Design of communication, (271-272)
  21. Yang G, Lin Y and Bhattacharya P (2008). Multimodality inferring of human cognitive states based on integration of neuro-fuzzy network and information fusion techniques, EURASIP Journal on Advances in Signal Processing, 2008, (8), Online publication date: 1-Jan-2008.
  22. Leng H, Lin Y and Zanzi L An experimental study on physiological parameters toward driver emotion recognition Proceedings of the 2007 international conference on Ergonomics and health aspects of work with computers, (237-246)
  23. Lin T, Imamiya A, Hu W and Omata M Display characteristics affect users' emotional arousal in 3D games Proceedings of the 9th conference on User interfaces for all, (337-351)
  24. Chanel G, Kronegg J, Grandjean D and Pun T Emotion assessment Proceedings of the 2006 international conference on Multimedia Content Representation, Classification and Security, (530-537)
  25. Bee N, Prendinger H, Nakasone A, André E and Ishizuka M AutoSelect Proceedings of the 2006 international tutorial and research conference on Perception and Interactive Technologies, (40-52)
  26. Goo J, Park K, Lee M, Park J, Hahn M, Ahn H and Picard R Effects of guided and unguided style learning on user attention in a virtual environment Proceedings of the First international conference on Technologies for E-Learning and Digital Entertainment, (1208-1222)
  27. Partala T, Surakka V and Vanhala T (2006). Real-time estimation of emotional experiences from facial expressions, Interacting with Computers, 18:2, (208-226), Online publication date: 1-Mar-2006.
  28. Lin T, Omata M, Hu W and Imamiya A Do physiological data relate to traditional usability indexes? Proceedings of the 17th Australia conference on Computer-Human Interaction: Citizens Online: Considerations for Today and the Future, (1-10)
  29. Lisetti C and Nasoz F (2004). Using noninvasive wearable computers to recognize human emotions from physiological signals, EURASIP Journal on Advances in Signal Processing, 2004, (1672-1687), Online publication date: 1-Jan-2004.
  30. Lukowicz P, Junker H, Stäger M, Büren T and Tröster G WearNET Proceedings of the 4th international conference on Ubiquitous Computing, (361-370)
  31. Picard R, Vyzas E and Healey J (2001). Toward Machine Emotional Intelligence, IEEE Transactions on Pattern Analysis and Machine Intelligence, 23:10, (1175-1191), Online publication date: 1-Oct-2001.