
Computing Graph Neural Networks: A Survey from Algorithms to Accelerators

Published: 08 October 2021

Abstract

Graph Neural Networks (GNNs) have exploded onto the machine learning scene in recent years owing to their capability to model and learn from graph-structured data. Such an ability has strong implications in a wide variety of fields whose data are inherently relational, for which conventional neural networks do not perform well. Indeed, as recent reviews can attest, research in the area of GNNs has grown rapidly and has led to the development of a variety of GNN algorithm variants as well as to the exploration of ground-breaking applications in chemistry, neurology, electronics, and communication networks, among others. At the current stage of research, however, the efficient processing of GNNs remains an open challenge for several reasons. Besides their novelty, GNNs are hard to compute due to their dependence on the input graph, their combination of dense and very sparse operations, and the need to scale to huge graphs in some applications. In this context, this article aims to make two main contributions. On the one hand, a review of the field of GNNs is presented from the perspective of computing. This includes a brief tutorial on GNN fundamentals, an overview of the evolution of the field over the last decade, and a summary of the operations carried out in the multiple phases of different GNN algorithm variants. On the other hand, an in-depth analysis of current software and hardware acceleration schemes is provided, from which a hardware-software, graph-aware, and communication-centric vision for GNN accelerators is distilled.
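To make the computational challenge concrete, consider the mix of dense and very sparse kernels in a single graph-convolution layer. The sketch below is ours, not code from the article: a minimal Python/NumPy/SciPy illustration following the standard GCN formulation of Kipf and Welling, in which the feature-combination step is a regular dense GEMM while the neighborhood-aggregation step is a sparse-dense multiplication (SpMM) whose access pattern is dictated entirely by the input graph.

import numpy as np
import scipy.sparse as sp

def gcn_layer(a_hat, x, w):
    # Combination phase: dense GEMM over the node features (N x F_in times F_in x F_out).
    h = x @ w
    # Aggregation phase: SpMM with the normalized adjacency matrix; its irregular,
    # graph-dependent access pattern is what makes GNNs hard to accelerate.
    h = a_hat @ h
    return np.maximum(h, 0.0)  # ReLU non-linearity

# Toy 4-node ring graph; A_hat = D^(-1/2) (A + I) D^(-1/2), as in the vanilla GCN.
n = 4
a = sp.lil_matrix((n, n))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    a[i, j] = a[j, i] = 1.0
a.setdiag(1.0)  # self-loops
deg = np.asarray(a.sum(axis=1)).ravel()
a_hat = (sp.diags(deg ** -0.5) @ a.tocsr() @ sp.diags(deg ** -0.5)).tocsr()

rng = np.random.default_rng(0)
x = rng.standard_normal((n, 8))      # dense node feature matrix
w = rng.standard_normal((8, 4))      # layer weights (random stand-ins, untrained)
print(gcn_layer(a_hat, x, w).shape)  # -> (4, 4)

In realistic workloads the adjacency matrix can span millions of nodes at densities far below 1%, which is why the acceleration schemes analyzed in the article typically treat aggregation and combination as distinct phases with different dataflows.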


Published In

ACM Computing Surveys, Volume 54, Issue 9
December 2022
800 pages
ISSN: 0360-0300
EISSN: 1557-7341
DOI: 10.1145/3485140
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 October 2021
Accepted: 01 July 2021
Revised: 01 July 2021
Received: 01 November 2020
Published in CSUR Volume 54, Issue 9

Author Tags

  1. Graph neural networks
  2. GNN algorithms
  3. accelerators
  4. graph embeddings

Qualifiers

  • Survey
  • Refereed

Funding Sources

  • European Union’s Horizon 2020 Research and Innovation Programme
  • Spanish Ministry of Economy and Competitiveness
  • FEDER

Article Metrics

  • Downloads (Last 12 months): 3,519
  • Downloads (Last 6 weeks): 271
Reflects downloads up to 30 Aug 2024

Cited By

  • (2024) A Survey of Computationally Efficient Graph Neural Networks for Reconfigurable Systems. Information 15:7 (377). DOI: 10.3390/info15070377. Online publication date: 28-Jun-2024.
  • (2024) A Survey on Graph Neural Network Acceleration: A Hardware Perspective. Chinese Journal of Electronics 33:3 (601-622). DOI: 10.23919/cje.2023.00.135. Online publication date: May-2024.
  • (2024) Eliminating Data Processing Bottlenecks in GNN Training over Large Graphs via Two-level Feature Compression. Proceedings of the VLDB Endowment 17:11 (2854-2866). DOI: 10.14778/3681954.3681968. Online publication date: 1-Jul-2024.
  • (2024) XGNN: Boosting Multi-GPU GNN Training via Global GNN Memory Store. Proceedings of the VLDB Endowment 17:5 (1105-1118). DOI: 10.14778/3641204.3641219. Online publication date: 2-May-2024.
  • (2024) Sparsity-Aware Communication for Distributed Graph Neural Network Training. Proceedings of the 53rd International Conference on Parallel Processing (117-126). DOI: 10.1145/3673038.3673152. Online publication date: 12-Aug-2024.
  • (2024) Distributed Graph Neural Network Training: A Survey. ACM Computing Surveys 56:8 (1-39). DOI: 10.1145/3648358. Online publication date: 10-Apr-2024.
  • (2024) Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification. Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (3563-3573). DOI: 10.1145/3637528.3671706. Online publication date: 25-Aug-2024.
  • (2024) Parallel and Distributed Graph Neural Networks: An In-Depth Concurrency Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence 46:5 (2584-2606). DOI: 10.1109/TPAMI.2023.3303431. Online publication date: May-2024.
  • (2024) Z-Laplacian Matrix Factorization: Network Embedding With Interpretable Graph Signals. IEEE Transactions on Knowledge and Data Engineering 36:8 (4331-4345). DOI: 10.1109/TKDE.2023.3331027. Online publication date: 1-Aug-2024.
  • (2024) An Efficient GCN Accelerator Based on Workload Reorganization and Feature Reduction. IEEE Transactions on Circuits and Systems I: Regular Papers 71:2 (646-659). DOI: 10.1109/TCSI.2023.3343515. Online publication date: Feb-2024.
