
Learning evolving relations for multivariate time series forecasting

Published: 15 March 2024

Abstract

Multivariate time series forecasting is essential in fields such as healthcare and traffic management, but it is challenging because of the strong dynamics in both intra-channel relations (temporal patterns within individual variables) and inter-channel relations (relationships between variables), which can evolve over time with abrupt changes. This paper proposes ERAN (Evolving Relational Attention Network), a framework for multivariate time series forecasting that is capable of capturing such dynamics. On the one hand, ERAN represents inter-channel relations with a graph that evolves over time, modeled using a recurrent neural network. On the other hand, ERAN represents intra-channel relations with a temporal attentional convolution that captures local temporal dependencies adaptively with respect to the input data. The evolving graph structure and the temporal attentional convolution are integrated into a unified model to capture both types of relations. The model is evaluated on a large number of real-life datasets, including traffic flows, energy consumption, and COVID-19 transmission data. The experimental results show a significant improvement over state-of-the-art methods for multivariate time series forecasting, particularly on non-stationary data.
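The abstract describes the architecture only at a high level. The minimal PyTorch sketch below illustrates the two ideas it names: an inter-channel graph whose adjacency is evolved over time by a recurrent cell, and an input-adaptive attention over a local temporal window per channel. All class names (EvolvingGraph, TemporalAttentionConv, ERANSketch), dimensions, and the way the two parts are composed are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EvolvingGraph(nn.Module):
    """Inter-channel relations as a graph whose node states, and hence the
    adjacency matrix, are updated over time by a GRU cell (assumed design)."""

    def __init__(self, dim: int):
        super().__init__()
        self.gru = nn.GRUCell(dim, dim)

    def forward(self, channel_summary, node_state):
        # channel_summary, node_state: (num_channels, dim)
        new_state = self.gru(channel_summary, node_state)
        # Row-normalised adjacency from pairwise similarity of evolved states.
        adj = F.softmax(new_state @ new_state.t(), dim=-1)
        return adj, new_state


class TemporalAttentionConv(nn.Module):
    """Intra-channel relations: a causal local window whose taps are
    re-weighted by input-dependent attention scores (assumed design)."""

    def __init__(self, dim: int, kernel_size: int = 3):
        super().__init__()
        self.kernel_size = kernel_size
        self.score = nn.Linear(dim, kernel_size)  # attention over the window
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (batch, time, dim); left-pad so each window only sees the past.
        padded = F.pad(x, (0, 0, self.kernel_size - 1, 0))
        windows = padded.unfold(1, self.kernel_size, 1)     # (B, T, dim, k)
        weights = F.softmax(self.score(x), dim=-1)          # (B, T, k)
        mixed = (windows * weights.unsqueeze(2)).sum(-1)    # (B, T, dim)
        return self.proj(mixed)


class ERANSketch(nn.Module):
    """Combines both parts: message passing over the evolved graph, then
    temporal attention per channel, then a linear forecasting head."""

    def __init__(self, num_channels: int, dim: int, horizon: int):
        super().__init__()
        self.embed = nn.Linear(1, dim)
        self.graph = EvolvingGraph(dim)
        self.temporal = TemporalAttentionConv(dim)
        self.head = nn.Linear(dim, horizon)

    def forward(self, x, node_state):
        # x: (batch, time, num_channels), node_state: (num_channels, dim)
        h = self.embed(x.unsqueeze(-1))              # (B, T, C, dim)
        summary = h.mean(dim=(0, 1))                 # (C, dim) channel summary
        adj, node_state = self.graph(summary, node_state)
        h = torch.einsum("btcd,ec->bted", h, adj)    # mix across channels
        B, T, C, D = h.shape
        h = h.permute(0, 2, 1, 3).reshape(B * C, T, D)
        h = self.temporal(h)[:, -1, :].reshape(B, C, D)
        forecast = self.head(h)                      # (B, C, horizon)
        return forecast.permute(0, 2, 1), node_state


# Example: forecast 12 steps ahead for 7 channels from a 96-step window.
model = ERANSketch(num_channels=7, dim=16, horizon=12)
state = torch.zeros(7, 16)
y_hat, state = model(torch.randn(4, 96, 7), state)
print(y_hat.shape)  # torch.Size([4, 12, 7])
```

In this reading, the node state returned at each step would be carried across successive forecasting windows so that the inferred graph can drift with the data, which is one way to interpret the "evolving" inter-channel relations the abstract describes.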



Published In

Applied Intelligence, Volume 54, Issue 5
Mar 2024
800 pages

Publisher

Kluwer Academic Publishers

United States

Publication History

Published: 15 March 2024
Accepted: 06 December 2023

Author Tags

  1. Time series forecasting
  2. Multivariate time series forecasting
  3. Dynamic graph neural networks
  4. Attention mechanism

Qualifiers

  • Research-article
