DOI: 10.1145/3502871.3502895
Research Article

Improving The Classification Accuracy of Negative Emotions Based on Multiple Physiological Signals Using A Novel Information Fusion Method FWBP

Published: 14 March 2022

Abstract

Emotion recognition is crucial in medical health and human-computer interaction. In recent years, identifying emotions from electroencephalogram (EEG) signals has received extensive attention and has achieved relatively satisfying results, yet the ability to identify negative emotions, which matter most in practical applications, remains limited. In addition, negative emotions have rarely been singled out as a primary emotional state for study. In this paper, four peripheral nervous system signals were therefore collected to assist the EEG signals in improving the negative emotion recognition rate, and, unlike previous studies based on a single information fusion method, a novel information fusion method termed FWBP was proposed, which combines weighted decision fusion, feature fusion and a BP network to fuse the information of five physiological signals. To this end, positive, neutral and negative emotions were evoked in 12 subjects by emotional videos, and EEG, electrodermal activity (EDA), electrocardiogram (ECG), photoplethysmogram (PPG) and respiration (RSP) were recorded simultaneously. The proposed FWBP method improved the mean accuracy by about 6% compared with the central nervous system model (CNSM), which used EEG signals alone, and, more importantly, increased the negative emotion recognition rate by about 10%. Furthermore, the results show that feeding the outputs of multiple information fusion methods into a BP neural network for a second round of fusion and classification allows the methods to complement each other and yields better emotion recognition performance.
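As a concrete illustration of the two-stage fusion idea described above, the sketch below assembles a pipeline of the same shape in Python with scikit-learn: per-modality classifiers combined by weighted decision fusion, a feature-fusion classifier on the concatenated features, and a small backprop-trained MLP standing in for the BP network that fuses the two first-stage outputs. The abstract does not specify the feature extraction, the base classifiers, or how the decision weights are chosen, so the SVM base learners, the fixed weight vector, and the MLP size here are assumptions made purely for illustration, not the authors' implementation.

```python
# Minimal sketch of a two-stage fusion pipeline in the spirit of FWBP.
# Assumptions (not given in the abstract): features are already extracted
# per modality, SVMs are used as base classifiers, decision weights are
# fixed, and the "BP network" is a small backprop-trained MLP.
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier


def weighted_decision_fusion(prob_list, weights):
    """Combine per-modality class-probability matrices with fixed weights."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return sum(w * p for w, p in zip(weights, prob_list))


def fwbp_fit_predict(X_train_mods, X_test_mods, y_train, weights):
    """X_*_mods: list of per-modality feature matrices (EEG, EDA, ECG, PPG, RSP)."""
    # Stage 1a: one classifier per modality -> weighted decision fusion.
    probs_train, probs_test = [], []
    for Xtr, Xte in zip(X_train_mods, X_test_mods):
        clf = SVC(probability=True).fit(Xtr, y_train)
        probs_train.append(clf.predict_proba(Xtr))
        probs_test.append(clf.predict_proba(Xte))
    dec_train = weighted_decision_fusion(probs_train, weights)
    dec_test = weighted_decision_fusion(probs_test, weights)

    # Stage 1b: feature-level fusion -> one classifier on concatenated features.
    feat_clf = SVC(probability=True).fit(np.hstack(X_train_mods), y_train)
    feat_train = feat_clf.predict_proba(np.hstack(X_train_mods))
    feat_test = feat_clf.predict_proba(np.hstack(X_test_mods))

    # Stage 2: a small MLP fuses the two first-stage probability outputs
    # and produces the final positive/neutral/negative prediction.
    bp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
    bp.fit(np.hstack([dec_train, feat_train]), y_train)
    return bp.predict(np.hstack([dec_test, feat_test]))
```

In practice the stage-1 probabilities fed to the second-stage network should come from held-out or cross-validated predictions rather than the training data itself, otherwise the fusion network learns from overly optimistic inputs.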


Cited By

(2024) Exploring Central-Peripheral Nervous System Interaction Through Multimodal Biosignals: A Systematic Review. IEEE Access, 12, 60347-60368. https://doi.org/10.1109/ACCESS.2024.3394036


Information & Contributors


Published In

ICBBE '21: Proceedings of the 2021 8th International Conference on Biomedical and Bioinformatics Engineering
November 2021
216 pages
ISBN:9781450385077
DOI:10.1145/3502871
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 March 2022


Author Tags

  1. Emotion recognition
  2. Information fusion
  3. Negative emotions
  4. Physiological signals

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICBBE '21



Article Metrics

  • Downloads (Last 12 months)9
  • Downloads (Last 6 weeks)0
Reflects downloads up to 09 Nov 2024

