
Finding Foundation Models for Time Series Classification with a PreText Task

  • Conference paper
Trends and Applications in Knowledge Discovery and Data Mining (PAKDD 2024)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14658)


Abstract

Over the past decade, Time Series Classification (TSC) has gained increasing attention. While various methods have been explored, deep learning, particularly through Convolutional Neural Networks (CNNs), stands out as an effective approach. However, due to the limited availability of training data, defining a foundation model for TSC that overcomes the overfitting problem remains a challenging task. The UCR archive, encompassing a wide spectrum of datasets ranging from motion recognition to ECG-based heart disease detection, serves as a prime example for exploring this issue across diverse TSC scenarios. In this paper, we address the overfitting challenge by introducing pre-trained domain foundation models. A key aspect of our methodology is a novel pretext task that spans multiple datasets: identifying the originating dataset of each time series sample, with the goal of learning flexible convolution filters that can be applied across different datasets. The training process consists of two phases: a pre-training phase in which the model acquires general features through the pretext task, and a subsequent fine-tuning phase for classification on a specific dataset. Our extensive experiments on the UCR archive demonstrate that this pre-training strategy significantly outperforms conventional training without pre-training. It effectively reduces overfitting on small datasets and provides an efficient route for adapting these models to new datasets, thus advancing the capabilities of deep learning in TSC.
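The two-phase scheme the abstract describes can be sketched end-to-end. The following is a minimal NumPy illustration, not the paper's actual architecture: the synthetic datasets, the single convolution layer, the filter sizes, and the learning rate are all illustrative assumptions. It pre-trains shared convolution filters on the dataset-identification pretext task, then reuses those filters with a fresh classification head for a target dataset.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

rng = np.random.default_rng(0)

# Hypothetical stand-ins for two small UCR-style datasets.
def make_class(freq, amp, n=32, length=128):
    t = np.linspace(0.0, 1.0, length)
    return amp * np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal((n, length))

ds_a = np.concatenate([make_class(3.0, 1.0), make_class(3.0, 1.8)])  # dataset A, 2 classes
y_a = np.repeat([0, 1], 32)                                          # A's own class labels
ds_b = make_class(9.0, 1.0, n=64)                                    # dataset B

# Pretext task: predict which dataset each sample came from.
x_pre = np.concatenate([ds_a, ds_b])
y_pre = np.repeat([0, 1], [len(ds_a), len(ds_b)])

K, J, n_cls = 8, 9, 2                    # filters, filter length, classes (assumed sizes)
w = 0.1 * rng.standard_normal((K, J))    # shared conv filters, learned in both phases

def forward(x, w, v):
    win = sliding_window_view(x, J, axis=1)   # (N, T, J) sliding windows
    z = np.einsum('ntj,kj->nkt', win, w)      # 1D convolution
    a = np.maximum(z, 0.0)                    # ReLU
    p = a.mean(axis=2)                        # global average pooling -> (N, K)
    return win, z, p, p @ v                   # logits

def softmax_ce(logits, y):
    logits = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(len(y)), y]).mean()
    d = probs
    d[np.arange(len(y)), y] -= 1.0
    return loss, d / len(y)

def train(x, y, w, lr=0.1, steps=300):
    v = 0.1 * rng.standard_normal((K, n_cls))   # fresh head for each phase
    losses = []
    for _ in range(steps):
        win, z, p, logits = forward(x, w, v)
        loss, dlogits = softmax_ce(logits, y)
        losses.append(loss)
        dv = p.T @ dlogits
        dp = dlogits @ v.T
        dz = (dp[:, :, None] / z.shape[2]) * (z > 0)  # backprop through GAP + ReLU
        dw = np.einsum('nkt,ntj->kj', dz, win)
        w -= lr * dw
        v -= lr * dv
    return w, losses

# Phase 1: pre-train the filters on the dataset-identification pretext task.
w, pre_losses = train(x_pre, y_pre, w)
# Phase 2: fine-tune on dataset A's own classification task, reusing the filters.
w, ft_losses = train(ds_a, y_a, w, steps=100)

print(f"pretext loss {pre_losses[0]:.3f} -> {pre_losses[-1]:.3f}, "
      f"fine-tune loss {ft_losses[0]:.3f} -> {ft_losses[-1]:.3f}")
```

In the paper's setting the pretext labels come for free (every UCR sample already belongs to a dataset), which is what makes the pre-training phase possible without extra annotation.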



Acknowledgment

This work was supported by the ANR DELEGATION project (grant ANR-21-CE23-0014) of the French Agence Nationale de la Recherche. The authors would like to acknowledge the High Performance Computing Center of the University of Strasbourg for supporting this work by providing scientific support and access to computing resources. Part of the computing resources were funded by the Equipex Equip@Meso project (Programme Investissements d’Avenir) and the CPER Alsacalcul/Big Data. The authors would also like to thank the creators and providers of the UCR Archive.


Corresponding author

Correspondence to Ali Ismail-Fawaz.



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Ismail-Fawaz, A., Devanne, M., Berretti, S., Weber, J., Forestier, G. (2024). Finding Foundation Models for Time Series Classification with a PreText Task. In: Wang, Z., Tan, C.W. (eds) Trends and Applications in Knowledge Discovery and Data Mining. PAKDD 2024. Lecture Notes in Computer Science, vol. 14658. Springer, Singapore. https://doi.org/10.1007/978-981-97-2650-9_10


  • DOI: https://doi.org/10.1007/978-981-97-2650-9_10


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-97-2649-3

  • Online ISBN: 978-981-97-2650-9

  • eBook Packages: Computer Science (R0)
