DOI: 10.5555/3666122.3668930

TopoSRL: topology preserving self-supervised simplicial representation learning

Published: 10 December 2023

Abstract

In this paper, we introduce TopoSRL, a novel self-supervised learning (SSL) method for simplicial complexes that effectively captures higher-order interactions and preserves topology in the learned representations. TopoSRL addresses the limitations of existing graph-based SSL methods, which typically concentrate on pairwise relationships and neglect the long-range dependencies crucial to capturing topological information. We propose a new simplicial augmentation technique that efficiently generates two views of the simplicial complex, enriching the learned representations. We then propose a new simplicial contrastive loss function that contrasts the generated simplices to preserve both the local and global information present in the simplicial complexes. Extensive experimental results demonstrate the superior performance of TopoSRL compared to state-of-the-art graph SSL techniques and supervised simplicial neural models across various datasets, corroborating the efficacy of TopoSRL in processing simplicial complex data in a self-supervised setting.
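The abstract does not spell out the loss, so the following is only a minimal, assumption-laden sketch of the general recipe it describes (two augmented views of a simplicial complex, with matching simplices contrasted against all others). The function name, tensor shapes, and temperature below are illustrative and are not taken from the paper; the actual TopoSRL augmentation and loss additionally preserve local and global topological structure, which this generic NT-Xent-style sketch does not attempt to model.

# Minimal sketch (assumptions, not the paper's exact formulation): an
# NT-Xent-style contrastive loss over simplex embeddings produced from two
# augmented views of the same simplicial complex.
import torch
import torch.nn.functional as F

def simplex_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """z1, z2: [num_simplices, dim] embeddings of the same simplices in two views."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    sim = z1 @ z2.t() / tau                     # pairwise cosine similarities between views
    targets = torch.arange(z1.size(0))          # i-th simplex in view 1 matches i-th in view 2
    # Symmetrized cross-entropy: matching simplices across views are positives,
    # all other simplices in the batch act as negatives.
    return 0.5 * (F.cross_entropy(sim, targets) + F.cross_entropy(sim.t(), targets))

if __name__ == "__main__":
    # Example with 128 simplices embedded in a 64-dimensional space; the second
    # view is a noisy stand-in for an augmented copy of the complex.
    view1 = torch.randn(128, 64)
    view2 = view1 + 0.1 * torch.randn(128, 64)
    print(simplex_contrastive_loss(view1, view2).item())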

Supplementary Material

Additional material (3666122.3668930_supp.pdf)
Supplemental material.



Published In

NIPS '23: Proceedings of the 37th International Conference on Neural Information Processing Systems
December 2023
80772 pages

Publisher

Curran Associates Inc.

Red Hook, NY, United States

Publication History

Published: 10 December 2023

Qualifiers

  • Research-article
  • Research
  • Refereed limited
