Multi-Scale Subgraph Contrastive Learning
Yanbei Liu, Yu Zhao, Xiao Wang, Lei Geng, Zhitao Xiao
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 2215-2223.
https://doi.org/10.24963/ijcai.2023/246
Graph-level contrastive learning, which learns a representation for each graph by contrasting two augmented graphs, has attracted considerable attention. Previous studies usually assume that a graph and its augmented graph form a positive pair, while any other pair is negative. However, graph structure is typically complex and multi-scale, which raises a fundamental question: after graph augmentation, does this assumption still hold in practice? Through an experimental analysis, we discover that the semantic information of an augmented graph structure may be inconsistent with that of the original graph, and that whether two augmented graphs form a positive or negative pair is closely related to their multi-scale structures. Based on this finding, we propose a multi-scale subgraph contrastive learning architecture that characterizes fine-grained semantic information. Specifically, we generate global and local views at different scales via subgraph sampling, and construct multiple contrastive relationships according to their semantic associations to provide richer self-supervised signals. Extensive experiments and parameter analyses on eight real-world graph classification datasets demonstrate the effectiveness of the proposed method.
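To make the contrastive objective concrete, the sketch below shows a standard InfoNCE-style loss between two sets of view embeddings (e.g., global subgraph views contrasted against local subgraph views). This is a minimal illustration of the generic contrastive mechanism the abstract describes, not the paper's actual multi-scale formulation; the function names, the cosine-similarity choice, and the temperature value are assumptions for illustration.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def info_nce(anchors, positives, temperature=0.5):
    """Generic InfoNCE loss: the i-th anchor's positive is the
    i-th embedding in `positives`; all other entries of
    `positives` serve as in-batch negatives."""
    losses = []
    for i, a in enumerate(anchors):
        sims = [math.exp(cosine(a, p) / temperature) for p in positives]
        losses.append(-math.log(sims[i] / sum(sims)))
    return sum(losses) / len(losses)

# Hypothetical embeddings: two graphs, each with a "global" and a
# "local" view produced by subgraph sampling.
global_views = [[1.0, 0.0], [0.0, 1.0]]
local_views = [[0.9, 0.1], [0.1, 0.9]]
loss = info_nce(global_views, local_views)
```

In a multi-scale setting, one such loss term could be computed for each pair of scales (global-global, global-local, local-local) and summed, so that each semantic association contributes its own self-supervised signal.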
Keywords:
Data Mining: DM: Mining graphs
Machine Learning: ML: Representation learning
Machine Learning: ML: Self-supervised Learning