Abstract
Electroencephalogram (EEG)-based emotion recognition has attracted considerable interest in recent years. However, efficiently fusing the different temporal and spatial features of EEG signals to improve recognition performance remains an open problem. This study therefore proposes a new deep learning architecture that combines a time–frequency convolutional neural network (TFCNN), a bidirectional gated recurrent unit (BiGRU), and a self-attention mechanism (SAM) to automatically extract features from EEG signals and classify emotions. First, the continuous wavelet transform (CWT), which responds readily to temporal frequency variations within EEG recordings, is used as a layer within the convolutional network to convert EEG signals into two-dimensional (2D) scalogram images for time-series and spatial representation learning. Second, to encode more discriminative emotion features, the 2D-CNN, BiGRU, and SAM are trained on these scalograms simultaneously to capture spatial, local, temporal, and global information. Finally, the EEG signals are classified into emotional states. The network learns the temporal dependencies of EEG emotion signals with the BiGRU, extracts local spatial features with the TFCNN, and improves recognition accuracy with the SAM, which explores global signal correlations by reassigning weights to emotion features. The proposed method was evaluated on the SEED and GAMEEMO datasets across three classification tasks: two target classes (positive and negative), three target classes (positive, neutral, and negative), and four target classes (boring, calm, horror, and funny). In comprehensive experiments, it achieved emotion recognition accuracies of 93.1%, 96.2%, and 92.9% on the two-, three-, and four-class tasks, respectively, which are 0.281%, 1.98%, and 2.57% higher than existing approaches evaluated on the same datasets with different subjects. The open-source code is available at https://www.mathworks.com/matlabcentral/fileexchange/165126-tfcnn-bigru
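The released implementation is in MATLAB (see the link above); purely as an illustration of the first stage, the Python sketch below converts a single-channel EEG segment into a 2D scalogram with the CWT via the PyWavelets library. The 200 Hz sampling rate matches SEED's downsampled recordings, but the segment length, scale range, and Morlet wavelet here are assumptions, not the paper's exact settings.

```python
# Illustrative sketch (not the authors' MATLAB code): convert one EEG
# segment into a 2D scalogram image with the continuous wavelet transform.
import numpy as np
import pywt

fs = 200                               # SEED recordings are downsampled to 200 Hz
segment = np.random.randn(fs)          # placeholder 1-s single-channel EEG segment
scales = np.arange(1, 65)              # 64 scales -> 64-row image (assumed range)
coeffs, freqs = pywt.cwt(segment, scales, 'morl', sampling_period=1 / fs)
scalogram = np.abs(coeffs)             # coefficient magnitudes form the 2D image
scalogram /= scalogram.max()           # normalize to [0, 1] before feeding the CNN
```

Likewise, a minimal PyTorch sketch of how the three components can be chained: 2D convolutions extract local spatial features from the scalogram, the BiGRU models temporal dependencies across time steps, and multi-head self-attention reassigns weights across the whole sequence. All layer widths, kernel sizes, and the mean-pooling step are illustrative assumptions rather than the published configuration.

```python
# Minimal sketch of the TFCNN-BiGRU-SAM idea (assumed sizes, not the paper's).
import torch
import torch.nn as nn

class TFCNNBiGRUSAM(nn.Module):
    def __init__(self, n_classes=3, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(                    # TFCNN: local spatial features
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.bigru = nn.GRU(32 * 16, hidden, batch_first=True,
                            bidirectional=True)      # temporal dependencies
        self.attn = nn.MultiheadAttention(2 * hidden, num_heads=4,
                                          batch_first=True)  # SAM: global weighting
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                    # x: (batch, 1, 64, T) scalograms
        f = self.cnn(x)                      # -> (batch, 32, 16, T/4)
        f = f.permute(0, 3, 1, 2).flatten(2) # -> (batch, T/4, 512) time sequence
        h, _ = self.bigru(f)                 # -> (batch, T/4, 2*hidden)
        a, _ = self.attn(h, h, h)            # self-attention over time steps
        return self.fc(a.mean(dim=1))        # pool and classify into emotions

logits = TFCNNBiGRUSAM()(torch.randn(2, 1, 64, 200))  # e.g. two 64x200 scalograms
```

Flattening the convolutional feature map along its time axis is what lets the recurrent and attention stages treat the scalogram as a sequence, which is the fusion of local, temporal, and global views that the abstract describes.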
Data availability
Data sharing is not applicable to this article as data sets were not generated or analyzed during the current study.
Acknowledgements
The authors would like to express their gratitude to Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2024R330), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Funding
This research was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2024R330), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Author information
Contributions
Essam H. Houssein: Supervision, Software, Methodology, Conceptualization, Formal analysis, Investigation, Visualization, Writing - review and editing. Asmaa Hammad: Formal analysis, Software, Methodology, Visualization, Resources, Data curation, Writing - original draft, Writing - review and editing. Nagwan Abdel Samee: Formal analysis, Investigation, Visualization, Resources, Writing - review and editing. Manal Abdullah Alohali: Formal analysis, Investigation, Visualization, Resources, Writing - review and editing. Abdelmgeid A. Ali: Supervision. All authors read and approved the final article.
Ethics declarations
Conflict of interest
The authors declare that there are no conflicts of interest.
Human and animal rights
This article does not contain any studies with human participants or animals performed by any of the authors.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Houssein, E.H., Hammad, A., Samee, N.A. et al. TFCNN-BiGRU with self-attention mechanism for automatic human emotion recognition using multi-channel EEG data. Cluster Comput 27, 14365–14385 (2024). https://doi.org/10.1007/s10586-024-04590-5