Tutorial | Open Access

Graph Time-series Modeling in Deep Learning: A Survey

Published: 28 February 2024
Abstract

    Time-series and graphs have been extensively studied for their ubiquitous existence in numerous domains. Both topics have been separately explored in the field of deep learning. For time-series modeling, recurrent neural networks or convolutional neural networks model the relations between values across timesteps, while for graph modeling, graph neural networks model the inter-relations between nodes. Recent research in deep learning requires simultaneous modeling for time-series and graphs when both representations are present. For example, both types of modeling are necessary for time-series classification, regression, and anomaly detection in graphs. This article aims to provide a comprehensive summary of these models, which we call graph time-series models. To the best of our knowledge, this is the first survey article that provides a picture of related models from the perspective of deep graph time-series modeling to address a range of time-series tasks, including regression, classification, and anomaly detection. Graph time-series models are split into two categories: (a) graph recurrent/convolutional neural networks and (b) graph attention neural networks. Under each category, we further categorize models based on their properties. Additionally, we compare representative models and discuss how distinctive model characteristics are utilized with respect to various model components and data challenges. Pointers to commonly used datasets and code are included to facilitate access for further research. In the end, we discuss potential directions for future research.
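The abstract's core idea, combining graph modeling with time-series modeling, can be illustrated with a minimal NumPy sketch of a graph-convolutional recurrent cell: a GCN-style aggregation over a normalized adjacency matrix replaces the dense input/state transforms of a vanilla RNN. All names, dimensions, and the toy graph below are illustrative assumptions, not taken from any specific model in the survey.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize an adjacency matrix with self-loops (GCN-style)."""
    A_loop = A + np.eye(A.shape[0])
    d = A_loop.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_loop @ D_inv_sqrt

def gcrn_step(A_hat, X_t, H_prev, Wx, Wh):
    """One step of a toy graph-convolutional recurrent cell: the dense
    transforms of a vanilla RNN are replaced by graph convolutions, so each
    node's hidden state mixes its own history with its neighbors'."""
    return np.tanh(A_hat @ X_t @ Wx + A_hat @ H_prev @ Wh)

rng = np.random.default_rng(0)
n_nodes, n_feats, n_hidden, T = 4, 3, 8, 5

# A small path graph: 0 - 1 - 2 - 3 (hypothetical sensor network).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = normalize_adjacency(A)

Wx = rng.normal(scale=0.1, size=(n_feats, n_hidden))
Wh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))

# Synthetic graph time-series: T timesteps of per-node feature vectors.
series = rng.normal(size=(T, n_nodes, n_feats))
H = np.zeros((n_nodes, n_hidden))
for X_t in series:
    H = gcrn_step(A_hat, X_t, H, Wx, Wh)

print(H.shape)  # (4, 8): one hidden state per node after T steps
```

The final hidden states could then feed a task head for forecasting, classification, or anomaly scoring; real models in this family add gating (GRU/LSTM-style) and learned or attention-based adjacency rather than a fixed graph.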


Published In

ACM Transactions on Knowledge Discovery from Data, Volume 18, Issue 5
June 2024, 699 pages
ISSN: 1556-4681
EISSN: 1556-472X
DOI: 10.1145/3613659

Publisher

Association for Computing Machinery, New York, NY, United States

    Publication History

    Published: 28 February 2024
    Online AM: 23 December 2023
    Accepted: 15 December 2023
    Revised: 24 August 2023
    Received: 13 April 2022
    Published in TKDD Volume 18, Issue 5

    Author Tags

    1. Graph neural networks
    2. time-series modeling
    3. spatial-temporal networks
    4. evolving graphs

