
NEAR: Neighborhood Edge AggregatoR for Graph Classification

Published: 22 April 2022

Abstract

Learning graph-structured data with graph neural networks (GNNs) has recently emerged as an important field because of its wide applicability in bioinformatics, chemoinformatics, social network analysis, and data mining. Recent GNN algorithms are based on neural message passing, which lets GNNs integrate local structures and node features recursively. However, GNN algorithms based on 1-hop neighborhood message passing risk losing information about local structures and the relationships among neighboring nodes. In this article, we propose Neighborhood Edge AggregatoR (NEAR), a framework that aggregates the relations between the nodes in a neighborhood via their edges. NEAR, which can be orthogonally combined with the Graph Isomorphism Network (GIN), produces an integrated representation that describes which nodes in the neighborhood are connected to each other. NEAR can therefore capture additional local structure around each node beyond the node features in its 1-hop neighborhood. Experimental results on multiple graph classification tasks show that our algorithm improves on existing 1-hop-based GNN algorithms.
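The core idea in the abstract can be illustrated with a minimal sketch. This is our own toy interpretation, not the paper's exact formulation: the function name, the scalar features, and the plain sum aggregator are all assumptions. Each node's update combines its own feature, a standard 1-hop sum over neighbor features, and additionally a sum over the features of edges that connect pairs of its neighbors, which is exactly the signal that plain 1-hop message passing discards.

```python
# Hypothetical sketch of neighborhood-edge aggregation (names and the
# sum aggregator are ours, not the paper's exact formulation).
from itertools import combinations

def near_layer(adj, x, edge_feat):
    """adj: node -> set of neighbors; x: node -> scalar feature;
    edge_feat: frozenset({u, v}) -> scalar edge feature."""
    out = {}
    for v, nbrs in adj.items():
        node_msg = sum(x[u] for u in nbrs)  # standard 1-hop aggregation
        # Extra term: aggregate edges *among* the neighbors of v,
        # i.e., which of v's neighbors are connected to each other.
        edge_msg = sum(
            edge_feat[frozenset((u, w))]
            for u, w in combinations(sorted(nbrs), 2)
            if w in adj[u]
        )
        out[v] = x[v] + node_msg + edge_msg
    return out

# Toy graph: a triangle {a, b, c} plus a pendant node d attached to a.
adj = {"a": {"b", "c", "d"}, "b": {"a", "c"}, "c": {"a", "b"}, "d": {"a"}}
x = {"a": 1.0, "b": 2.0, "c": 3.0, "d": 4.0}
edge_feat = {frozenset(e): 1.0
             for e in [("a", "b"), ("a", "c"), ("b", "c"), ("a", "d")]}

h = near_layer(adj, x, edge_feat)
# Node "a" sees the edge (b, c) among its neighbors, so its membership
# in the triangle is reflected in its update; node "d" sees no such edge.
```

A plain 1-hop aggregator would give node "a" the same update whether or not its neighbors b and c were connected; the `edge_msg` term is what distinguishes the two cases.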


Cited By

  • (2024) GCNXG: Detecting Fraudulent Activities in Financial Networks: A Graph Analytics and Machine Learning Fusion. In Renewable Energy, Green Computing, and Sustainable Development, 17-32. DOI: 10.1007/978-3-031-58607-1_2. Online publication date: 18-Apr-2024.

Published In

ACM Transactions on Intelligent Systems and Technology  Volume 13, Issue 3
June 2022
415 pages
ISSN:2157-6904
EISSN:2157-6912
DOI:10.1145/3508465
Editor: Huan Liu

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 22 April 2022
Accepted: 01 December 2021
Revised: 01 June 2021
Received: 01 July 2020
Published in TIST Volume 13, Issue 3


Author Tags

  1. Graph classification
  2. graph neural network
  3. 1-dimensional Weisfeiler-Lehman test
  4. deep neural network

Qualifiers

  • Research-article
  • Refereed

Funding Sources

  • Basic Science Research Program through the National Research Foundation of Korea
  • Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIP)
  • Artificial Intelligence Graduate School Program (POSTECH)
  • ITRC (Information Technology Research Center)

