DOI: 10.1007/978-3-031-20862-1_29
Article

PLAE: Time-Series Prediction Improvement by Adaptive Decomposition

Published: 10 November 2022

Abstract

Univariate time-series forecasting is a common yet challenging problem. The performance of most forecasting algorithms is constrained by the limited information carried in a single input dimension: no matter how capable an algorithm is, it cannot produce accurate output on an unpredictable time-series. This paper presents PLAE (Predictability Leveraging Auto-Encoder), a Seq2Seq model for univariate time-series data that enhances the accuracy of a given forecasting algorithm without dimensional adaptation. The main idea is to decompose the original input into a group of more predictable microscopic time-series on which the forecasting algorithm can deliver more accurate outputs; the final prediction is then obtained by aggregating those component forecasts back into the original single dimension. Experiments on three public data sets and one real-world data set show that PLAE improves forecast accuracy by 23.38% in terms of MAPE and 19.76% in terms of RMSE. Moreover, experimental evidence shows that PLAE's adaptive non-linear decomposition mechanism outperforms pre-defined additive decomposition with respect to both forecasting performance and component interpretability.
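The decompose-forecast-aggregate idea described above can be sketched in a few lines. The sketch below is only an illustration under stated assumptions: it uses a pre-defined additive (moving-average) split, the kind of baseline the paper argues PLAE's learned adaptive non-linear decomposition outperforms, plus a naive persistence forecaster. It is not the authors' PLAE model, and the names decompose, forecast_component, and aggregate are hypothetical.

```python
# Minimal sketch of decompose -> forecast -> aggregate for a univariate series.
# NOTE: this fixed additive split is a stand-in, not PLAE's learned Seq2Seq
# decomposition; all function names here are hypothetical.
import numpy as np

def decompose(y, period=12):
    """Split y into trend, seasonal, and residual via a moving average."""
    trend = np.convolve(y, np.ones(period) / period, mode="same")
    detrended = y - trend
    # Average each phase of the cycle to estimate a repeating seasonal pattern.
    pattern = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(pattern, len(y) // period + 1)[: len(y)]
    return [trend, seasonal, y - trend - seasonal]

def forecast_component(component, horizon):
    """Naive persistence forecast; any base algorithm could be plugged in."""
    return np.repeat(component[-1], horizon)

def aggregate(component_forecasts):
    """Sum the per-component forecasts back into one dimension."""
    return np.sum(component_forecasts, axis=0)

if __name__ == "__main__":
    t = np.arange(120)
    y = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * np.random.randn(120)
    forecasts = [forecast_component(c, horizon=6) for c in decompose(y)]
    print(aggregate(forecasts))  # 6-step-ahead prediction
```

The point of the sketch is the pipeline shape, not the components: PLAE replaces the fixed moving-average split with a decomposition learned end-to-end so that each microscopic series is itself more predictable.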


Cited By

  • (2023) Learning Gaussian mixture representations for tensor time series forecasting. In: Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI 2023), pp. 2077–2085. https://doi.org/10.24963/ijcai.2023/231. Online publication date: 19 Aug 2023



Published In

PRICAI 2022: Trends in Artificial Intelligence: 19th Pacific Rim International Conference on Artificial Intelligence, PRICAI 2022, Shanghai, China, November 10–13, 2022, Proceedings, Part I
Nov 2022
615 pages
ISBN:978-3-031-20861-4
DOI:10.1007/978-3-031-20862-1
  • Editors:
  • Sankalp Khanna,
  • Jian Cao,
  • Quan Bai,
  • Guandong Xu

Publisher

Springer-Verlag

Berlin, Heidelberg

Publication History

Published: 10 November 2022

Author Tags

  1. Time-series prediction
  2. Time-series decomposition
  3. Predictability measure
  4. Accuracy enhancement

Qualifiers

  • Article

