
TIformer: A Transformer-Based Framework for Time-Series Forecasting with Missing Data

  • Conference paper
  • Published in: Databases Theory and Applications (ADC 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15449)

Abstract

Long Sequence Time-series Forecasting (LSTF) is a fundamental problem with wide real-world applications. However, missing data are ubiquitous in time series, posing significant challenges and degrading the accuracy of long sequence time-series forecasting. To address this issue, we propose TIformer (Time-series Imputation transformer), a comprehensive framework that imputes missing data and then performs time-series forecasting. TIformer consists of two major modules: a missing-data imputation module and a time-series forecasting module. The first module applies data-imputation methods to improve data quality for downstream forecasting. The second module employs a transformer-based model for long-term prediction that leverages frequency information in the time series. TIformer effectively improves time-series forecasting performance on data with missing values. Experiments on various datasets demonstrate the effectiveness of our method in improving prediction accuracy.
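
The two-stage pipeline described in the abstract (impute first, then forecast with a frequency-aware model) can be illustrated with a minimal sketch. The paper's actual modules are learned; the stand-ins below (linear interpolation for imputation, truncated-FFT extrapolation for frequency-based forecasting) and the names `impute_linear` and `forecast_fft` are hypothetical illustrations, not the paper's method or API.

```python
import numpy as np

def impute_linear(series):
    # Stage 1 (imputation): fill NaN gaps by linear interpolation,
    # standing in for TIformer's learned imputation module.
    x = np.asarray(series, dtype=float).copy()
    idx = np.arange(len(x))
    mask = np.isnan(x)
    x[mask] = np.interp(idx[mask], idx[~mask], x[~mask])
    return x

def forecast_fft(series, horizon, n_freqs=3):
    # Stage 2 (forecasting): keep the n_freqs strongest rFFT components
    # and evaluate them beyond the observed window, standing in for a
    # transformer forecaster that leverages frequency information.
    n = len(series)
    coeffs = np.fft.rfft(series)
    keep = np.argsort(np.abs(coeffs))[::-1][:n_freqs]
    t = np.arange(n + horizon)
    recon = np.zeros(n + horizon)
    for k in keep:
        scale = 1.0 if k in (0, n // 2) else 2.0  # DC/Nyquist not doubled
        amp = scale * np.abs(coeffs[k]) / n
        recon += amp * np.cos(2 * np.pi * k * t / n + np.angle(coeffs[k]))
    return recon[n:]

# Toy pipeline: a sine wave with a few missing observations.
t = np.arange(96, dtype=float)
y = np.sin(2 * np.pi * t / 24)
y[[5, 17, 40]] = np.nan        # simulate missing data
clean = impute_linear(y)
pred = forecast_fft(clean, horizon=24)
```

The point of the sketch is the data flow: forecasting quality depends directly on the quality of the imputed input, which is the motivation for making imputation an explicit first stage.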


Author information

Correspondence to Yufan Chen.


Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Ding, Z., Chen, Y., Wang, H., Wang, X., Zhang, W., Zhang, Y. (2025). TIformer: A Transformer-Based Framework for Time-Series Forecasting with Missing Data. In: Chen, T., Cao, Y., Nguyen, Q.V.H., Nguyen, T.T. (eds) Databases Theory and Applications. ADC 2024. Lecture Notes in Computer Science, vol 15449. Springer, Singapore. https://doi.org/10.1007/978-981-96-1242-0_6

Download citation

  • DOI: https://doi.org/10.1007/978-981-96-1242-0_6

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-96-1241-3

  • Online ISBN: 978-981-96-1242-0

  • eBook Packages: Computer Science (R0)
