Temporal patterns decomposition and Legendre projection for long-term time series forecasting

The Journal of Supercomputing

Abstract

Long-term time series forecasting (LTSF) uses historical data to forecast future sequences that lie relatively far ahead in time, supporting long-term warnings, planning, and decision-making. LTSF is more challenging than short-term forecasting because of its larger output length: forecasting methods must accurately capture long-term temporal dependencies from complex sequences with intertwined temporal patterns. For LTSF tasks, existing works propose variants of recurrent neural networks, convolutional neural networks, and transformers to capture temporal dependencies. However, these methods often have an insufficient ability to capture long-term temporal dependencies and excessively high complexity, resulting in unreliable forecasting performance. We therefore propose an LTSF method based on temporal patterns decomposition and Legendre projection (TPDLP). First, we use temporal patterns decomposition to disentangle complex temporal patterns and perform decomposed refinement forecasting. Second, we use a high-order Legendre polynomial projection with a signal transfer module based on multilayer perceptron networks to capture long-term temporal dependencies. Finally, we introduce targeted data normalization to alleviate the impact of distribution shifts on sequence forecasting. In extensive experiments on six popular real-world datasets, TPDLP achieves an average relative improvement of 15.8% in prediction error over the best baseline, while also demonstrating superior efficiency, which showcases its utility in real-world applications. Code is available at this repository: https://github.com/JoeDoex/TPDLP.
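As a rough illustration of the three ingredients the abstract names (temporal patterns decomposition, Legendre polynomial projection, and distribution-shift normalization), here is a minimal NumPy sketch. It assumes a moving-average trend/seasonal split, a least-squares Legendre fit, and RevIN-style instance normalization; the function names, kernel size, and polynomial order are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from numpy.polynomial import legendre


def decompose(x, kernel=13):
    """Split a 1-D series into a moving-average trend and a seasonal remainder.

    Endpoint padding keeps the trend the same length as the input (odd kernel).
    """
    pad = kernel // 2
    padded = np.concatenate([np.repeat(x[0], pad), x, np.repeat(x[-1], pad)])
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    return trend, x - trend


def legendre_project(x, order=8):
    """Project a series onto Legendre polynomials on [-1, 1].

    Returns the (order + 1) coefficients and the reconstruction they imply;
    the coefficients are a compact summary of long-range structure.
    """
    t = np.linspace(-1.0, 1.0, len(x))
    coeffs = legendre.legfit(t, x, deg=order)
    recon = legendre.legval(t, coeffs)
    return coeffs, recon


def instance_norm(x, eps=1e-8):
    """RevIN-style per-instance normalization: keep stats to undo the shift later."""
    mu, sigma = x.mean(), x.std() + eps
    return (x - mu) / sigma, (mu, sigma)


# Toy sequence: slow linear trend plus fast seasonality.
n = 200
t = np.arange(n)
series = 0.01 * t + np.sin(2 * np.pi * t / 12)

trend, seasonal = decompose(series, kernel=13)       # decomposition step
coeffs, recon = legendre_project(trend, order=8)     # Legendre memory of the trend
normed, (mu, sigma) = instance_norm(series)          # shift-robust normalization
restored = normed * sigma + mu                       # de-normalization after forecasting
```

In a TPDLP-like pipeline, a forecasting network would operate on the decomposed, normalized components (e.g. predicting future Legendre coefficients), then de-normalize and recombine the parts into the final long-horizon forecast.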




Data availability

All data generated or analyzed during this study are included in this article.

Code availability

Code will be made available on request.

Notes

  1. https://www.bgc-jena.mpg.de/wetter/.

  2. http://pems.dot.ca.gov.

  3. https://archive.ics.uci.edu/ml/datasets/ElectricityLoadDiagrams20112014.

  4. https://gis.cdc.gov/grasp/fluview/fluportaldashboard.html.


Funding

This work was supported in part by the National Key Research and Development Program of China (No. 2021YFE014400) and in part by the National Natural Science Foundation of China (No. 62102187).

Author information


Corresponding author

Correspondence to Tinghuai Ma.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Liu, J., Ma, T., Su, Y. et al. Temporal patterns decomposition and Legendre projection for long-term time series forecasting. J Supercomput 80, 23407–23441 (2024). https://doi.org/10.1007/s11227-024-06313-4

