DOI: 10.1145/3637732.3637734

Fusion Network Modeling for Cross-Time Emotion Recognition from EEG

Published: 28 February 2024

Abstract

In recent years, emotion recognition technology has come to play an important role in many industries. Most publicly available cross-temporal EEG models focus on extracting temporal or spatial features for emotion classification, while neglecting to integrate the brain’s overall structure with multidimensional information; as a result, methods based on manual feature extraction perform poorly in cross-temporal EEG emotion recognition. To address these issues, this paper proposes a novel cross-temporal EEG emotion recognition model consisting of a graph convolutional network and a convolutional neural network with an attention mechanism. The graph convolutional network naturally fits the brain’s structure and provides both temporal and spatial information based on channel positions, while the attention-equipped convolutional neural network extracts important and discriminative features and enhances the representational capacity of the deep network. Five experiments were conducted on 24 subjects, with time intervals of 1, 3, 7, and 14 days, yielding a dataset of 720 EEG recordings covering three emotions. The data from the first three experiments constituted the training set, and the data from the final two experiments served as the test set. By integrating multiple networks into a unified architecture, we demonstrate improved classification performance: our method achieved an average classification accuracy of 62.79% on the dataset used in this study. The experimental results indicate that the fused model can comprehensively and flexibly learn from and process EEG signals, providing the deep and shallow features required for classification.
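
The abstract does not give implementation details, but the architecture it describes (a graph-convolution branch over EEG channels fused with an attention-equipped CNN branch feeding a three-class emotion classifier) can be illustrated with a minimal PyTorch sketch. All names, layer sizes, the assumed input shape (62 channels with 5 band-power features each), and the squeeze-and-excitation-style attention are hypothetical choices for illustration only, not the authors' implementation.

```python
# Minimal sketch of a GCN + attention-CNN fusion model of the kind described
# above. Shapes, layer sizes, and names are hypothetical, not the authors' code.
import torch
import torch.nn as nn


class SimpleGCNLayer(nn.Module):
    """One graph convolution over EEG channels: H = ReLU(A_norm · X · W)."""

    def __init__(self, in_feats, out_feats, num_channels):
        super().__init__()
        self.proj = nn.Linear(in_feats, out_feats)
        # Learnable channel-adjacency matrix, initialised to the identity
        # (a stand-in for an adjacency built from electrode positions).
        self.adj = nn.Parameter(torch.eye(num_channels))

    def forward(self, x):                       # x: (batch, channels, feats)
        a = torch.softmax(self.adj, dim=-1)     # row-normalised adjacency
        return torch.relu(self.proj(a @ x))


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style attention over CNN feature maps."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):                       # x: (batch, C, H, W)
        w = self.fc(x.mean(dim=(2, 3)))         # squeeze spatial dimensions
        return x * w[:, :, None, None]          # re-weight each feature map


class FusionNet(nn.Module):
    """GCN branch + attention-CNN branch, concatenated for 3-class output."""

    def __init__(self, num_channels=62, in_feats=5, num_classes=3):
        super().__init__()
        self.gcn = SimpleGCNLayer(in_feats, 32, num_channels)
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            ChannelAttention(16), nn.AdaptiveAvgPool2d(1))
        self.head = nn.Linear(num_channels * 32 + 16, num_classes)

    def forward(self, x):                       # x: (batch, channels, feats)
        deep = self.gcn(x).flatten(1)           # graph features per channel
        shallow = self.cnn(x.unsqueeze(1)).flatten(1)  # pooled conv features
        return self.head(torch.cat([deep, shallow], dim=1))


# Usage: a batch of 8 trials, 62 electrodes, 5 band features per electrode.
logits = FusionNet()(torch.randn(8, 62, 5))     # -> (8, 3) emotion scores
```

Concatenating the flattened graph features with the pooled convolutional features is one straightforward way to realise the "deep and shallow features" fusion the abstract mentions; the paper itself may combine the branches differently.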



Published In

ICBBE '23: Proceedings of the 2023 10th International Conference on Biomedical and Bioinformatics Engineering
November 2023
295 pages
ISBN: 9798400708343
DOI: 10.1145/3637732
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. EEG
  2. affective computing
  3. deep learning
  4. emotion recognition

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICBBE 2023

