GDDN: Graph Domain Disentanglement Network for Generalizable EEG Emotion Recognition

Published: 01 July 2024

Abstract

Cross-subject EEG emotion recognition is hampered by high inter-subject variability in emotional responses. Most prior studies aim to reduce inter-subject discrepancies in EEG feature distributions but ignore the variable EEG connectivity and the prediction deviation caused by individual differences, which can lead to poor generalization to unseen subjects. This article proposes a graph domain disentanglement network (GDDN) that generalizes EEG emotion recognition across subjects at three levels: EEG connectivity, representation, and prediction. Specifically, a graph domain disentanglement module extracts common and subject-specific characteristics from both the EEG graph connectivity and the graph representation, making the network more comprehensively transferable to an unseen individual. To stabilize emotion prediction, a domain-adaptive classifier aggregation module adapts the prediction for the unseen individual, conditioned on the domain weights of the input samples. Finally, an auxiliary supervision module alleviates domain discrepancy and reduces information loss during disentanglement learning. Extensive experiments on three public EEG emotion datasets, SEED, SEED-IV, and MPED, validate the superior generalizability of GDDN over state-of-the-art methods.
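To make the two central ideas in the abstract concrete, here is a minimal, self-contained PyTorch sketch of (a) a common graph branch shared across subjects alongside per-subject "specific" branches, and (b) domain-weighted aggregation of per-source classifiers. This is an illustrative reconstruction, not the authors' GDDN implementation: the GraphLayer with a learnable dense adjacency, the layer sizes, and the defaults (62 channels, 5 band features, 14 source subjects, as in a SEED-style leave-one-subject-out setup) are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphLayer(nn.Module):
    """One graph convolution over EEG channels with a learnable dense adjacency."""

    def __init__(self, n_channels, in_dim, out_dim):
        super().__init__()
        self.adj = nn.Parameter(torch.eye(n_channels))  # learnable connectivity
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x):                        # x: (batch, channels, in_dim)
        a = torch.softmax(self.adj, dim=-1)      # row-normalize the adjacency
        return F.relu(self.lin(a @ x))           # propagate, then transform


class DisentangledModel(nn.Module):
    """Hypothetical sketch: common/specific branches + domain-weighted classifiers."""

    def __init__(self, n_channels=62, in_dim=5, hid=32, n_classes=3, n_sources=14):
        super().__init__()
        self.common = GraphLayer(n_channels, in_dim, hid)      # shared across subjects
        self.specific = nn.ModuleList(                         # one branch per source subject
            GraphLayer(n_channels, in_dim, hid) for _ in range(n_sources))
        flat = n_channels * hid
        self.domain_head = nn.Linear(flat, n_sources)          # predicts domain weights
        self.classifiers = nn.ModuleList(                      # one classifier per source subject
            nn.Linear(flat, n_classes) for _ in range(n_sources))

    def forward(self, x, source=None):
        z = self.common(x).flatten(1)                    # common (transferable) features
        w = torch.softmax(self.domain_head(z), dim=-1)   # (batch, n_sources) domain weights
        logits = torch.stack([clf(z) for clf in self.classifiers], dim=1)
        pred = (w.unsqueeze(-1) * logits).sum(dim=1)     # domain-weighted aggregation
        if source is not None:                           # training: expose specific features
            s = self.specific[source](x).flatten(1)      # for auxiliary disentanglement losses
            return pred, z, s
        return pred                                      # inference on an unseen subject


model = DisentangledModel()
emotion_logits = model(torch.randn(8, 62, 5))  # 8 samples, 62 channels, 5 band features
print(emotion_logits.shape)                    # torch.Size([8, 3])
```

In a full training loop one would add the auxiliary terms the abstract describes, for example a loss pushing the common features z away from the source-specific features s and a domain-discrepancy penalty on z; those are omitted from this sketch.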

Published In

IEEE Transactions on Affective Computing, Volume 15, Issue 3, July-Sept. 2024, 1087 pages

Publisher

IEEE Computer Society Press

Washington, DC, United States

Qualifiers

  • Research-article
