Functional connectivity-enhanced feature-grouped attention network for cross-subject EEG emotion recognition

Published: 04 March 2024

Abstract

Electroencephalogram (EEG)-based automatic emotion recognition is attracting significant attention and has become crucial in the field of brain–computer interfaces (BCIs). In particular, deep learning methods have been widely used for emotion recognition in recent years. However, most existing methods focus on the spatiotemporal information of EEG signals and ignore the potential relationships between brain activity signals and the differences in functional connectivity under different emotions. Here, we propose a functional connectivity-enhanced feature-grouped attention network (FC-FAN) for cross-subject emotion recognition. FC-FAN is a dual-input model: one input consists of differential entropy features derived from the original EEG signals, while the other comprises functional connectivity features obtained by computing the phase synchronization index. The primary EEG features of the two input streams are first extracted by two dedicated residual blocks. Next, the designed time-series feature grouped attention module (TFGAM) and functional connectivity feature grouped attention module (F²GAM) highlight informative features and suppress uninformative ones for the two feature streams, respectively. Finally, the resulting representations interact through a fusion operator. The proposed framework not only learns the spatiotemporal features of EEG signals sufficiently but also explicitly models nonlinear correlations between electrode signals. Comprehensive experiments confirm that FC-FAN performs well on subject-independent emotion recognition tasks.
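
The two input streams named in the abstract can be illustrated with a short sketch. The Python code below is not taken from the paper; it shows how the two kinds of features it describes are commonly computed: differential entropy per frequency band, using the standard Gaussian closed form 0.5·log(2πeσ²), and a channel-by-channel functional-connectivity matrix from the phase synchronization index, here assumed to follow the usual phase-locking formulation. The band edges, window length, sampling rate, and channel count are illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of the two feature inputs the
# abstract describes: differential entropy (DE) per channel/band and a
# functional-connectivity matrix from the phase synchronization index (PSI).
# Band edges, window length, and the PSI formulation are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter one EEG window, shape (channels, samples)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def differential_entropy(x):
    """DE of an approximately Gaussian band-limited signal, per channel:
    0.5 * log(2 * pi * e * var)."""
    var = np.var(x, axis=-1)
    return 0.5 * np.log(2 * np.pi * np.e * var)

def phase_sync_index(x):
    """PSI between every pair of channels, assuming the standard
    phase-locking definition |mean(exp(j * (phi_i - phi_j)))|."""
    phase = np.angle(hilbert(x, axis=-1))            # instantaneous phase
    n_ch = x.shape[0]
    psi = np.ones((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            dphi = phase[i] - phase[j]
            psi[i, j] = psi[j, i] = np.abs(np.mean(np.exp(1j * dphi)))
    return psi

# Example: one 4-second window of 62-channel EEG sampled at 200 Hz
# (hypothetical dimensions for illustration only).
fs, eeg = 200, np.random.randn(62, 4 * 200)
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 50)}
de_features = np.stack([differential_entropy(bandpass(eeg, lo, hi, fs))
                        for lo, hi in bands.values()])   # (bands, channels)
fc_features = np.stack([phase_sync_index(bandpass(eeg, lo, hi, fs))
                        for lo, hi in bands.values()])   # (bands, ch, ch)
```

In a dual-input design of the kind the abstract outlines, `de_features` would feed the time-series branch and `fc_features` the functional-connectivity branch before the two are fused.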

Published In

Knowledge-Based Systems, Volume 283, Issue C, January 2024, 948 pages
Publisher: Elsevier Science Publishers B.V., Netherlands

          Author Tags

          1. Deep learning
          2. EEG emotion recognition
          3. Functional connectivity features
          4. Spatiotemporal information
