
Personal-Zscore: Eliminating Individual Difference for EEG-Based Cross-Subject Emotion Recognition

Published: 01 July 2023

Abstract

In EEG-based affective computing, subject-dependent emotion recognition models are consistently far more accurate than subject-independent ones. This gap is mainly caused by individual differences in electroencephalogram (EEG) signals, which are the key obstacle to deploying emotion recognition in practice. In this work, 14 subjects from the SEED dataset were selected for individual difference analysis. Through evaluation of individual feature aggregation, visualization of the sample space, and correlation analysis, we propose four quantitative indicators for characterizing the individual difference phenomenon. Finally, we present the Personal-Zscore (PZ) feature processing method and show that data processed with PZ represents emotion better than the original data, and that conventional models trained on PZ-processed features are more robust. The accuracy of emotion recognition models trained with PZ processing improved measurably, indicating that the PZ method can effectively eliminate individual aggregation in the feature space and improve the emotional representativeness of data sets. Our findings may thus provide a new foundation for the universal implementation of EEG-based applications, and the Personal-Zscore feature processing method is of practical significance for building effective emotion recognition systems.
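The abstract does not spell out the PZ computation, but the name suggests standardizing each subject's features by that subject's own statistics rather than the pooled statistics of all subjects. A minimal sketch under that assumption (the function name `personal_zscore` and the toy data are illustrative, not from the paper):

```python
import numpy as np

def personal_zscore(features, subject_ids):
    """Standardize each subject's feature rows by that subject's own
    mean and standard deviation (a per-subject z-score)."""
    features = np.asarray(features, dtype=float)
    subject_ids = np.asarray(subject_ids)
    out = np.empty_like(features)
    for sid in np.unique(subject_ids):
        mask = subject_ids == sid
        mu = features[mask].mean(axis=0)
        sigma = features[mask].std(axis=0)
        sigma[sigma == 0] = 1.0  # guard against constant features
        out[mask] = (features[mask] - mu) / sigma
    return out

# Toy example: two subjects whose features live on very different scales,
# mimicking the individual aggregation effect described in the abstract.
X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0],      # subject A
              [100.0, 0.1], [200.0, 0.2], [300.0, 0.3]])  # subject B
subjects = np.array(["A", "A", "A", "B", "B", "B"])
Z = personal_zscore(X, subjects)
# After normalization, each subject's features have zero mean and unit
# variance, so the two subjects occupy a comparable feature space.
```

Because each subject is mapped onto a common scale, a classifier trained on one group of subjects no longer sees subject-specific offsets in the features of a new subject, which is consistent with the cross-subject improvement the abstract reports.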


Cited By

  • Emotion Recognition From Few-Channel EEG Signals by Integrating Deep Feature Aggregation and Transfer Learning, IEEE Transactions on Affective Computing, vol. 15, no. 3, pp. 1315–1330, Jul. 2024. doi:10.1109/TAFFC.2023.3336531
  • Sparse Bayesian Learning for End-to-End EEG Decoding, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 45, no. 12, pp. 15632–15649, Dec. 2023. doi:10.1109/TPAMI.2023.3299568

Publisher

IEEE Computer Society Press, Washington, DC, United States

Qualifiers

  • Research-article
