DOI: 10.1145/3511808.3557478

Towards Self-supervised Learning on Graphs with Heterophily

Published: 17 October 2022

Abstract

Recently emerged heterophilous graph neural networks have significantly reduced the reliance on the assumption of graph homophily, under which linked nodes have similar features and labels. However, these methods focus on a supervised setting that relies heavily on label information, which limits their applicability to general downstream tasks. In this work, we propose a self-supervised representation learning paradigm for graphs with heterophily (named HGRL) that improves the generalizability of node representations by optimizing them without any label guidance. Inspired by the designs of existing heterophilous graph neural networks, HGRL learns node representations by preserving the original node features and capturing informative distant neighbors. These two properties are obtained through carefully designed pretext tasks that are optimized based on estimated high-order mutual information. Theoretical analysis interprets the connections between HGRL and existing advanced graph neural network designs. Extensive experiments on different downstream tasks demonstrate the effectiveness of the proposed framework.
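To make the two ingredients named in the abstract concrete — selecting informative neighbors beyond the local graph structure, and optimizing representations with a mutual-information objective — here is a minimal NumPy sketch. It is not the paper's implementation: the cosine-similarity neighbor selection, the InfoNCE-style bound, and all function names (`knn_by_features`, `infonce_loss`) are illustrative assumptions standing in for HGRL's learned selection and high-order MI estimation.

```python
import numpy as np

def knn_by_features(X, k):
    """Pick the k most feature-similar nodes for each node (cosine similarity),
    excluding the node itself -- a simple stand-in for the 'informative
    distant neighbors' that HGRL selects via estimated mutual information."""
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    sim = Xn @ Xn.T
    np.fill_diagonal(sim, -np.inf)          # never select the node itself
    return np.argsort(-sim, axis=1)[:, :k]  # (n, k) neighbor indices

def infonce_loss(Z, pos_idx, tau=0.5):
    """InfoNCE-style lower bound on the mutual information between each
    node's embedding and the embeddings of its selected neighbors."""
    Zn = Z / (np.linalg.norm(Z, axis=1, keepdims=True) + 1e-12)
    logits = Zn @ Zn.T / tau
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # average negative log-likelihood of the positive (neighbor) pairs
    rows = np.repeat(np.arange(Z.shape[0]), pos_idx.shape[1])
    return -log_probs[rows, pos_idx.ravel()].mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))          # toy node feature matrix
neighbors = knn_by_features(X, k=2)  # feature-based (possibly distant) neighbors
loss = infonce_loss(X, neighbors)    # MI-based pretext objective on raw features
```

In a full pipeline, `Z` would come from a trainable encoder and the loss would be minimized by gradient descent; here the raw features serve only to show the shape of the objective.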



Published In

CIKM '22: Proceedings of the 31st ACM International Conference on Information & Knowledge Management
October 2022
5274 pages
ISBN:9781450392365
DOI:10.1145/3511808
General Chairs: Mohammad Al Hasan, Li Xiong

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. graph neural networks
  2. heterophilous graph
  3. representation learning
  4. self-supervised learning

Qualifiers

  • Research-article

Funding Sources

  • Natural Science Foundation of Jiangsu Province award
  • Key Research and Development Program of Jiangsu Province award
  • Open Research Projects of Zhejiang Lab award
  • National Natural Science Foundation of China award

Conference

CIKM '22

Acceptance Rates

CIKM '22 Paper Acceptance Rate 621 of 2,257 submissions, 28%;
Overall Acceptance Rate 1,861 of 8,427 submissions, 22%

Article Metrics

  • Downloads (Last 12 months)194
  • Downloads (Last 6 weeks)13
Reflects downloads up to 03 Feb 2025

Cited By

  • (2024) Graph contrastive learning under heterophily via graph filters. Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence, 3936-3955. DOI: 10.5555/3702676.3702860. Online publication date: 15-Jul-2024.
  • (2024) Unifying Graph Neural Networks with a Generalized Optimization Framework. ACM Transactions on Information Systems, Vol. 42, 6, 1-32. DOI: 10.1145/3660852. Online publication date: 19-Aug-2024.
  • (2024) Graph Contrastive Learning via Interventional View Generation. Proceedings of the ACM Web Conference 2024, 1024-1034. DOI: 10.1145/3589334.3645687. Online publication date: 13-May-2024.
  • (2024) Graph Contrastive Learning Reimagined: Exploring Universality. Proceedings of the ACM on Web Conference 2024, 641-651. DOI: 10.1145/3589334.3645480. Online publication date: 13-May-2024.
  • (2024) Unsupervised Graph Representation Learning Beyond Aggregated View. IEEE Transactions on Knowledge and Data Engineering, Vol. 36, 12, 9504-9516. DOI: 10.1109/TKDE.2024.3418576. Online publication date: Dec-2024.
  • (2024) BotSCL: Heterophily-Aware Social Bot Detection with Supervised Contrastive Learning. Pattern Recognition, 53-68. DOI: 10.1007/978-3-031-78183-4_4. Online publication date: 4-Dec-2024.
