A multi-relational neighbors constructed graph neural network for heterophily graph learning

Published in: Applied Intelligence

Abstract

Graph neural networks (GNNs) have shown great power in learning graph representations. However, most current GNNs are built on the homophily assumption and exhibit two primary weaknesses when applied to heterophily graphs: difficulty in capturing long-range dependencies and inability to distinguish the spatial relationships of neighbors. To address these issues, we propose a multi-relational neighbors constructed graph neural network (MRN-GNN). Its core components, neighbor reconstruction and a bi-level attention aggregation mechanism, provide an effective way to enhance the representation of heterophily graphs. Specifically, for neighbor reconstruction, we establish connections between node pairs with highly similar features, making it possible to capture long-range dependencies. Meanwhile, we construct multi-relational neighbors for each node to distinguish the different spatial structures of its neighbors. Based on the reconstructed graph, a bi-level aggregation scheme is proposed to enable hierarchical aggregation, facilitating better feature transmission among multi-relational nodes. During this process, an attention mechanism dynamically assigns weights to each neighbor under different relations, further strengthening the representation capability. In this work, we focus on the node classification task on heterophily graphs. We conduct comprehensive experiments on seven datasets, covering both heterophily and homophily settings. Compared with representative methods, our MRN-GNN demonstrates significant superiority on heterophily graphs while also achieving competitive results on homophily graphs.
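
To make the two core ideas above more concrete, the sketch below illustrates, in simplified form, how feature-similarity-based neighbor reconstruction and relation-level attention aggregation could be realized. It is a minimal sketch under assumed design choices (cosine similarity with top-k selection, mean aggregation per relation, and a learnable per-relation attention vector); the function names and tensor shapes are illustrative and do not reproduce the authors' actual MRN-GNN implementation.

# Hypothetical sketch (not the authors' code): feature-similarity neighbor
# reconstruction and relation-level attention aggregation in PyTorch.
import torch
import torch.nn.functional as F


def reconstruct_neighbors(x: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Connect each node to its k most feature-similar nodes.

    x: (N, d) node feature matrix.
    Returns a dense (N, N) adjacency matrix of the reconstructed graph.
    """
    x_norm = F.normalize(x, p=2, dim=1)           # unit-length feature rows
    sim = x_norm @ x_norm.t()                     # (N, N) cosine similarities
    sim.fill_diagonal_(float("-inf"))             # exclude self-pairs
    topk = sim.topk(k, dim=1).indices             # k most similar nodes per node
    adj = torch.zeros_like(sim)
    adj.scatter_(1, topk, 1.0)                    # directed edges to top-k peers
    return adj


def attention_aggregate(x: torch.Tensor,
                        relations: list,
                        a: torch.Tensor) -> torch.Tensor:
    """Aggregate neighbors per relation, then attention-combine the relations.

    relations: list of R dense (N, N) adjacencies, one per neighbor relation.
    a:         (R, d) attention vectors, one per relation (assumed form).
    """
    msgs = []
    for adj in relations:
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        msgs.append((adj @ x) / deg)              # mean of neighbors under this relation
    msgs = torch.stack(msgs, dim=1)               # (N, R, d)
    scores = (msgs * a.unsqueeze(0)).sum(dim=-1)  # (N, R) relation relevance
    alpha = torch.softmax(scores, dim=1)          # attention weights over relations
    return (alpha.unsqueeze(-1) * msgs).sum(dim=1)  # (N, d) fused representation

For example, calling reconstruct_neighbors on raw node features and then attention_aggregate over the original and reconstructed adjacencies would mix local neighbors with long-range, feature-similar neighbors; in the actual model these steps are trained end to end together with the node classifier.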

Data Availability Statement

All data used in this work are publicly available to researchers through the works [30, 48].

References

  1. Li S, Zhou J, Xu T et al (2021) Structure-aware interactive graph neural networks for the prediction of protein-ligand binding affinity. In: Proceedings of the 27th ACM SIGKDD conference on knowledge discovery & data mining, pp 975–985

  2. Li M, Micheli A, Wang YG, Pan S, Lió P, Gnecco GS, Sanguineti M (2024) Guest editorial: Deep neural networks for graphs: Theory, models, algorithms, and applications. IEEE Trans Neural Netw Learn Syst 35(4):4367–4372

  3. Wu Z, Pan S, Chen F, Long G, Zhang C, Yu PS (2021) A comprehensive survey on graph neural networks. IEEE Trans Neural Netw Learn Syst 32(1):4–24

  4. Wang H, Xu T, Liu Q, Lian D, Chen E, Du D, Wu H, Su W (2019) Mcne: an end-to-end framework for learning multiple conditional network representations of social network. In: Proceedings of the 25th ACM SIGKDD international conference on knowledge discovery and data mining, pp 1064–1072

  5. Yan D, Xie W, Zhang Y (2022) Heterogeneous information network-based interest composition with graph neural network for recommendation. Appl Intell 52(10):11199–11213

  6. Zhou J, Cui G, Hu S et al (2020) Graph neural networks: A review of methods and applications. AI Open 1:57–81

  7. Chen H, Yeh CCM, Wang F et al (2022) Graph neural transport networks with non-local attentions for recommender systems. In: Proceedings of the ACM web conference, pp 1955–1964

  8. Li M, Zhang L, Cui L, Bai L, Li Z, Wu X (2023) Blog: Bootstrapped graph representation learning with local and global regularization for recommendation. Pattern Recognit 144:109874

  9. Cao Z, Xu Q, Yang Z, Cao X, Huang Q (2021) Dual quaternion knowledge graph embeddings. Proc AAAI Conf Artif Intell 35:6894–6902

  10. Luo Y, Luo G, Yan K et al (2022) Inferring from references with differences for semi-supervised node classification on graphs. Mathematics 10(8):1262

  11. Luo Y, Chen A, Yan K et al (2021) Distilling self-knowledge from contrastive links to classify graph nodes without passing messages. arXiv preprint arXiv:2106.08541

  12. Wang H, Leskovec J (2020) Unifying graph convolutional neural networks and label propagation. arXiv preprint arXiv:2002.06755

  13. Izadi MR, Fang Y, Stevenson R et al (2020) Optimization of graph neural networks with natural gradient descent. In: 2020 IEEE International conference on big data (big Data), pp 171–179

  14. Wang B, Shen T, Long G et al (2021) Structure-augmented text representation learning for efficient knowledge graph completion. In: Proceedings of the web conference, pp 1737–1748

  15. Clouatre L, Trempe P, Zouaq A et al (2020) Mlmlm: Link prediction with mean likelihood masked language model. arXiv preprint arXiv:2009.07058

  16. Pinter Y, Eisenstein J (2018) Predicting semantic relations using global graph properties. arXiv preprint arXiv:1808.08644

  17. Nouranizadeh A, Matinkia M, Rahmati M et al (2021) Maximum entropy weighted independent set pooling for graph neural networks. arXiv preprint arXiv:2107.01410

  18. Di X, Yu P, Bu R et al (2020) Mutual information maximization in graph neural networks. In: 2020 International joint conference on neural networks (IJCNN), pp 1–7

  19. Nguyen DQ, Nguyen TD, Phung D (2022) Universal graph transformer self-attention networks. In: Companion proceedings of the web conference, pp 193–196

  20. Wang YG, Li M, Ma Z, Montúfar G, Zhuang X, Fan Y (2019) Haar graph pooling. In: International conference on machine learning

  21. Kipf TN, Welling M (2017) Semi-supervised classification with graph convolutional networks. In: International conference on learning representations (ICLR)

  22. Hamilton W, Ying Z, Leskovec J (2017) Inductive representation learning on large graphs. In: Advances in neural information processing systems, pp 1024–1034

  23. Veličković P, Cucurull G, Casanova A et al (2017) Graph attention networks. arXiv preprint arXiv:1710.10903

  24. Huang K, Wang YG, Li M, Lio P (2024) How universal polynomial bases enhance spectral graph neural networks: Heterophily, over-smoothing, and over-squashing. In: Forty-first international conference on machine learning

  25. Zheng X, Liu Y, Pan S et al (2022) Graph neural networks for graphs with heterophily: A survey. arXiv preprint arXiv:2202.07082

  26. Xu K, Li C, Tian Y et al (2018) Representation learning on graphs with jumping knowledge networks. In: International conference on machine learning, pp 5453–5462

  27. Chen M et al (2020) Simple and deep graph convolutional networks. In: International conference on machine learning. PMLR

  28. Liu M, Wang Z, Ji S (2021) Non-local graph neural networks. IEEE Trans Pattern Anal Mach Intell 44(12):10270–10276

  29. Abu-El-Haija S, Perozzi B, Kapoor A et al (2019) Mixhop: Higher-order graph convolutional architectures via sparsified neighborhood mixing. In: International conference on machine learning, pp 21–29. PMLR

  30. Pei H et al (2020) Geom-gcn: Geometric graph convolutional networks. arXiv preprint arXiv:2002.05287

  31. Zhu J et al (2020) Beyond homophily in graph neural networks: Current limitations and effective designs. In: Advances in neural information processing systems, pp 7793–7804

  32. Bruna J, Zaremba W, Szlam A et al (2013) Spectral networks and locally connected networks on graphs. arXiv preprint arXiv:1312.6203

  33. Xu B, Shen H, Cao Q et al (2019) Graph wavelet neural network. arXiv preprint arXiv:1904.07785

  34. Wang X, Zhang M (2022) How powerful are spectral graph neural networks. In: International conference on machine learning, pp 23341–23362. PMLR

  35. Niepert M, Ahmed M, Kutzkov K (2016) Learning convolutional neural networks for graphs. In: International conference on machine learning, pp 2014–2023. PMLR

  36. Defferrard M, Bresson X, Vandergheynst P (2016) Convolutional neural networks on graphs with fast localized spectral filtering. In: Advances in neural information processing systems, vol 29

  37. Gao H, Liu Y, Ji S (2021) Topology-aware graph pooling networks. IEEE Trans Pattern Anal Mach Intell 43(12):4512–4518

  38. Jin W, Derr T, Wang Y, Ma Y, Liu Z, Tang J (2021) Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM international conference on web search and data mining, pp 148–156

  39. Grover A, Leskovec J (2016) node2vec: Scalable feature learning for networks. In: Proceedings of the 22nd ACM SIGKDD international conference on knowledge discovery and data mining, pp 855–864

  40. Hou Y, Zhang J, Cheng J et al (2019) Measuring and improving the use of graph information in graph neural networks. In: International conference on learning representations

  41. Xie Y, Li S, Yang C et al (2020) When do gnns work: Understanding and improving neighborhood aggregation. In: IJCAI, pp 1303–1309

  42. Alon U, Yahav E (2020) On the bottleneck of graph neural networks and its practical implications. arXiv preprint arXiv:2006.05205

  43. Li J, Zheng R, Feng H, Li M, Zhuang X (2024) Permutation equivariant graph framelets for heterophilous graph learning. IEEE Trans Neural Netw Learn Syst 1–15

  44. Huang C, Li M, Cao F, Fujita H, Li Z, Wu X (2023) Are graph convolutional networks with random weights feasible? IEEE Trans Pattern Anal Mach Intell 45(3):2751–2768

  45. Sen P, Namata G, Bilgic M, Getoor L, Galligher B, Eliassi-Rad T (2008) Collective classification in network data. AI Mag 29(3):93–106

  46. Namata G, London B, Getoor L, Huang B (2012) Query-driven active surveying for collective classification. In: International workshop on mining and learning with graphs

  47. Tang J, Sun J, Wang C, Yang Z (2009) Social influence analysis in large-scale networks. In: Proceedings of the 15th ACM SIGKDD International conference on knowledge discovery and data mining, pp 807–816. ACM

  48. Lim D, Hohne F, Li X, Huang SL, Gupta V, Bhalerao O, Lim SN (2021) Large scale learning on non-homophilous graphs: New benchmarks and strong simple methods. Adv Neural Inf Process Syst 34:20887–20902

  49. He D, Liang C, Liu H et al (2022) Block modeling-guided graph convolutional neural networks. Proc AAAI Conf Artif Intell 36:4022–4029

  50. Li G, Mueller M, Qian G et al (2021) Deepgcns: Making gcns go as deep as cnns. IEEE Trans Pattern Anal Mach Intell 1–1

  51. Topping J et al (2021) Understanding over-squashing and bottlenecks on graphs via curvature. arXiv preprint arXiv:2111.14522

  52. Zhu J, Rossi RA, Rao A, Mai T, Lipka N, Ahmed NK, Koutra D (2021) Graph neural networks with heterophily. Proc AAAI Conf Artif Intell 35:11168–11176

  53. Kingma DP, Ba J (2014) Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980

  54. van der Maaten L, Hinton G (2008) Visualizing data using t-SNE. J Mach Learn Res 9:2579–2605

Acknowledgements

This research was supported by the National Natural Science Foundation of China under Grant 62172184 and the Science and Technology Development Plan of Jilin Province of China under Grant 20200401077GX.

Author information

Contributions

Huan Xu conceived and designed the research, conducted experiments, and authored the paper, followed by iterative revisions of the manuscript. Yan Gao assisted in manuscript review and proofreading. Quanle Liu contributed to graphical content suggestions. Mei Bie participated in the review of reference materials. Professor Xiangjiu Che acted as the corresponding author, conducting manuscript checks, providing valuable insights, and overseeing communication with the journal, including correspondence and responses.

Corresponding author

Correspondence to Xiangjiu Che.

Ethics declarations

Conflicts of Interest

The authors declare that they have no conflict of interest.

Ethical and Informed Consent for Data Used

Ethical and informed consent for data used.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Xu, H., Gao, Y., Liu, Q. et al. A multi-relational neighbors constructed graph neural network for heterophily graph learning. Appl Intell 55, 13 (2025). https://doi.org/10.1007/s10489-024-06056-y
