
DPHM-Net: De-redundant multi-period hybrid modeling network for long-term series forecasting

Published: 22 June 2024

Abstract

Deep learning models have been widely applied to long-term forecasting and have achieved significant success. Incorporating inductive biases such as periodicity to model multi-granularity representations of time series is a commonly employed design approach in forecasting methods. However, existing methods still face information redundancy both during the extraction of the inductive bias and during the learning of multi-granularity features. This redundant information impedes the model from acquiring a comprehensive temporal representation and thereby degrades its predictive performance. To address these issues, we propose a De-Redundant Multi-Period Hybrid Modeling Network (DPHM-Net) that effectively eliminates redundant information both from the series inductive-bias extraction mechanism and from the multi-granularity series features learned during time series representation learning. In DPHM-Net, we propose an efficient representation learning process based on a period inductive bias and introduce the concept of de-redundancy among multiple time series into the representation learning of a single time series. Additionally, we design a specialized gated unit to dynamically balance the elimination weights between series features and redundant semantic information. Extensive experiments on real-world datasets demonstrate that our method outperforms previous state-of-the-art approaches on long-term forecasting tasks while remaining highly efficient.
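The gated unit described above is not specified in detail in the abstract, but its role (dynamically weighting how much of an estimated redundant component is removed from a series feature) can be sketched with a standard sigmoid gate. The following is a minimal illustrative sketch, not the paper's implementation; the function name, the linear gate parameterization `Wg`, `bg`, and the residual-style blend are all assumptions for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_deredundancy(x, r, Wg, bg):
    """Hypothetical gated de-redundancy step.

    x  : (d,)    series feature vector
    r  : (d,)    estimated redundant component of x
    Wg : (d, 2d) gate weights, bg : (d,) gate bias
    Returns a de-redundant feature of shape (d,).
    """
    # Elementwise gate in (0, 1), conditioned on both the feature
    # and the redundancy estimate.
    g = sigmoid(Wg @ np.concatenate([x, r]) + bg)
    # Blend the raw feature with its redundancy-subtracted version;
    # the gate decides per dimension how much redundancy to remove.
    return g * x + (1.0 - g) * (x - r)

rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)
r = rng.standard_normal(d)
Wg = rng.standard_normal((d, 2 * d)) * 0.1
bg = np.zeros(d)
out = gated_deredundancy(x, r, Wg, bg)
```

In a trained model the gate parameters would be learned jointly with the rest of the network, so the elimination weight adapts per feature dimension rather than being a fixed hyperparameter.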


Published In

World Wide Web  Volume 27, Issue 4
Jul 2024
419 pages

Publisher

Kluwer Academic Publishers

United States

Publication History

Published: 22 June 2024
Accepted: 06 June 2024
Revision received: 29 May 2024
Received: 26 December 2023

Author Tags

  1. Time series forecasting
  2. Multi-granularity learning
  3. Information de-redundancy
  4. Inductive bias extraction

Qualifiers

  • Research-article
