Research article

Simplicial Complex Neural Networks

Published: 13 October 2023

Abstract

Graph-structured data, in which nodes exhibit either pair-wise or high-order relations, are ubiquitous and essential in graph learning. Despite the great achievements of existing graph learning models, these models use only the direct information in a graph (edges or hyperedges) and do not exploit the underlying indirect information (hidden pair-wise or high-order relations). To address this issue, we propose a general framework named the Simplicial Complex Neural (SCN) network, in which we construct a simplicial complex from both the direct and indirect information of a graph, so that all of this information can be employed in learning on the complex. Specifically, we learn representations of simplices by aggregating and integrating information from all simplices via layer-by-layer simplicial complex propagation. Consequently, the representations of nodes, edges, and other high-order simplices are obtained simultaneously and can be used for downstream learning tasks. By making use of block-matrix properties, we derive a theoretical bound on the simplicial complex filter learned by the propagation and establish a generalization error bound for the proposed simplicial complex network. Extensive experiments on node (0-simplex), edge (1-simplex), and triangle (2-simplex) classification show that the proposed method outperforms existing graph and hypergraph network approaches.
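The abstract does not spell out the exact propagation rule, so the following is only a hypothetical sketch of what one layer of propagation over a simplicial complex might look like, using the Hodge 1-Laplacian commonly used in simplicial signal processing; the toy complex, features, weights, and step size `alpha` are all made up for illustration.

```python
import numpy as np

# Toy complex: 4 nodes, edges (0,1),(0,2),(1,2),(1,3), one triangle (0,1,2).
edges = [(0, 1), (0, 2), (1, 2), (1, 3)]
triangles = [(0, 1, 2)]

# Boundary matrix B1 (nodes x edges): oriented edge (i, j) gets -1 at i, +1 at j.
B1 = np.zeros((4, len(edges)))
for k, (i, j) in enumerate(edges):
    B1[i, k], B1[j, k] = -1.0, 1.0

# Boundary matrix B2 (edges x triangles): d[v0,v1,v2] = [v1,v2] - [v0,v2] + [v0,v1].
B2 = np.zeros((len(edges), len(triangles)))
for t, (i, j, k) in enumerate(triangles):
    for face, sign in [((j, k), 1.0), ((i, k), -1.0), ((i, j), 1.0)]:
        B2[edges.index(face), t] = sign

# Hodge 1-Laplacian: couples edge features through shared nodes and triangles.
L1 = B1.T @ B1 + B2 @ B2.T

# One hypothetical propagation layer on edge (1-simplex) features:
# X_out = ReLU((I - alpha * L1) @ X @ W), with learnable weights W.
rng = np.random.default_rng(0)
X = rng.standard_normal((len(edges), 3))   # input edge features
W = rng.standard_normal((3, 3))            # weight matrix (would be learned)
alpha = 0.1                                # diffusion step size
X_out = np.maximum((np.eye(len(edges)) - alpha * L1) @ X @ W, 0.0)
print(X_out.shape)
```

Analogous layers over B1 alone (node level) or B2 alone (triangle level) would yield the 0-simplex and 2-simplex representations mentioned above.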



Publisher: IEEE Computer Society, United States
