Time Series Forecasting with TiDE
Explore the architecture of TiDE and apply it in a forecasting project using Python
In our exploration of the latest advances in time series forecasting, we have covered N-HiTS, PatchTST, TimeGPT, and TSMixer.
While much effort has been devoted to applying the Transformer architecture to forecasting, it turns out to deliver only mediocre performance relative to its computational requirements.
In fact, simple linear models have been shown to outperform complex Transformer-based models on many benchmark datasets (see Zeng et al., 2022).
Motivated by that, in April 2023, researchers at Google proposed TiDE: a long-term forecasting model with an encoder-decoder architecture built with Multilayer Perceptrons (MLPs).
In their paper Long-term Forecasting with TiDE: Time-series Dense Encoder, the authors demonstrate that the model achieves state-of-the-art results on numerous datasets when compared to both Transformer-based and MLP-based models, such as PatchTST and N-HiTS respectively.
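To fix ideas before the detailed walkthrough, here is a minimal PyTorch sketch of the kind of residual MLP block the paper uses as the basic building unit of its encoder and decoder. The class name, layer sizes, and dropout value are illustrative assumptions of mine, not the authors' code.

```python
import torch
import torch.nn as nn

class ResidualMLPBlock(nn.Module):
    """Sketch of TiDE's basic building block: dense layer with ReLU,
    projection, dropout, a linear skip connection, and layer norm.
    Names and sizes are illustrative, not taken from the paper's code."""

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int, dropout: float = 0.1):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
            nn.Dropout(dropout),
        )
        self.skip = nn.Linear(in_dim, out_dim)  # linearly projected residual connection
        self.norm = nn.LayerNorm(out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.norm(self.mlp(x) + self.skip(x))

# Example: map a lookback window of 64 values to a 32-dimensional encoding
block = ResidualMLPBlock(in_dim=64, hidden_dim=128, out_dim=32)
print(block(torch.randn(8, 64)).shape)  # torch.Size([8, 32])
```

Stacking blocks like this one is what lets TiDE model non-linearities without any attention mechanism.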
In this article, we first explore the architecture and inner workings of TiDE. Then, we apply the model in Python in a small forecasting experiment of our own.
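As a preview of the hands-on part, the snippet below sketches how one might fit TiDE using the Darts library's TiDEModel on the classic AirPassengers dataset. The choice of library, dataset, and hyperparameters here is purely illustrative and not the experiment conducted later in the article.

```python
# Rough quick-start: fitting TiDE via Darts' TiDEModel (illustrative settings)
from darts.datasets import AirPassengersDataset
from darts.dataprocessing.transformers import Scaler
from darts.models import TiDEModel

series = AirPassengersDataset().load()
train, val = series[:-36], series[-36:]

# Scale the series, since neural forecasters generally train better on normalized data
scaler = Scaler()
train_scaled = scaler.fit_transform(train)

model = TiDEModel(
    input_chunk_length=36,   # length of the lookback window
    output_chunk_length=12,  # forecast horizon produced per step
    n_epochs=50,
    random_state=42,
)
model.fit(train_scaled)

# Forecast the next 36 months and map the result back to the original scale
pred = scaler.inverse_transform(model.predict(n=36))
print(pred.values()[:5])
```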