DOI: 10.1007/978-981-97-2303-4_12
Article

SEGCN: Structural Enhancement Graph Clustering Network

Published: 29 May 2024

Abstract

Deep graph clustering, which reveals the intrinsic structure and underlying relationships of graph node data, has become a challenging research task of great interest for graph-structured data. In recent years, graph clustering methods have achieved improved performance without human guidance by combining auto-encoders and graph convolutional networks. However, existing methods suffer from two problems: 1) noise aggregated through the graph structure degrades model learning; 2) the methods focus on learning the local structure of the graph and fail to capture its global structure, resulting in incomplete graph representations. To overcome these shortcomings, we propose a Structural Enhancement Graph Clustering Network (SEGCN) that learns both the graph topology hidden in the dependencies among node attributes and the global structure information. Specifically, SEGCN computes attribute correlations to extract the graph topology hidden in node-attribute dependencies and designs a dynamic fusion strategy to enrich the original graph structure. Meanwhile, SEGCN enriches the embeddings extracted from the local structure by fusing in global structure information through a global-structure dynamic fusion method. Moreover, to support subsequent clustering, SEGCN applies pseudo-cluster labels learned by a dual self-supervised strategy to guide the learning of node attribute representations and to optimize the network weight matrices. Extensive experiments on four benchmark datasets show that SEGCN achieves excellent performance on all datasets, attaining an accuracy of 71.56% and an NMI of 44.50% on the CITE dataset.
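The two ingredients the abstract describes — extracting a topology graph from node-attribute correlations and fusing it with the original adjacency — can be illustrated with a minimal NumPy sketch. This is not the paper's exact formulation (SEGCN's fusion weights are learned dynamically; here `alpha` is a fixed hypothetical parameter, and cosine similarity with a k-nearest-neighbor cutoff stands in for the paper's correlation measure):

```python
import numpy as np

def attribute_topology(X, k=5):
    """Build an attribute-similarity graph: connect each node to its
    k most similar neighbors under cosine similarity of attributes."""
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    S = Xn @ Xn.T                       # pairwise cosine similarity
    np.fill_diagonal(S, -np.inf)        # ignore self-similarity
    A_feat = np.zeros_like(S)
    idx = np.argsort(-S, axis=1)[:, :k] # top-k neighbors per node
    rows = np.repeat(np.arange(X.shape[0]), k)
    A_feat[rows, idx.ravel()] = 1.0
    return np.maximum(A_feat, A_feat.T) # symmetrize

def fuse_graphs(A_orig, A_feat, alpha=0.5):
    """Convex combination of the original adjacency and the
    attribute-derived graph; SEGCN would learn this weight."""
    return alpha * A_orig + (1.0 - alpha) * A_feat
```

A graph convolution run on `fuse_graphs(A_orig, A_feat)` then aggregates over both the given edges and the attribute-inferred ones, which is the intuition behind enriching the original structure.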



Published In

Web and Big Data: 7th International Joint Conference, APWeb-WAIM 2023, Wuhan, China, October 6–8, 2023, Proceedings, Part I
Oct 2023
532 pages
ISBN: 978-981-97-2302-7
DOI: 10.1007/978-981-97-2303-4
Editors: Xiangyu Song, Ruyi Feng, Yunliang Chen, Jianxin Li, Geyong Min

Publisher

Springer-Verlag

Berlin, Heidelberg


Author Tags

  1. graph clustering
  2. structural enhancement
  3. global structure
  4. convolutional networks
  5. self-supervised
