We propose an efficient design of Transformer-based models for multivariate time series forecasting and self-supervised representation learning.
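The two ingredients named here, patching and channel independence, can be sketched without the library: each channel of the multivariate series is cut into fixed-length patches, and each channel's patch sequence is treated as a separate sample for a shared backbone. A minimal NumPy sketch (the helper name `make_patches` is illustrative, not part of any API):

```python
import numpy as np

def make_patches(series, patch_len, stride):
    """Split a (time, channels) series into patches of length `patch_len`
    taken every `stride` steps. Returns shape (channels, num_patches,
    patch_len): under channel independence, each channel's patch sequence
    is fed to the shared Transformer backbone independently."""
    T, C = series.shape
    num_patches = (T - patch_len) // stride + 1
    patches = np.stack([
        series[i * stride : i * stride + patch_len, :]   # (patch_len, C)
        for i in range(num_patches)
    ])                                                   # (num_patches, patch_len, C)
    return patches.transpose(2, 0, 1)                    # (C, num_patches, patch_len)

x = np.random.randn(32, 3)                # 32 time steps, 3 channels
p = make_patches(x, patch_len=8, stride=8)
print(p.shape)                            # (3, 4, 8): 4 non-overlapping patches per channel
```

Patching shortens the token sequence the attention layers see (here 32 steps become 4 tokens per channel), which is the source of the efficiency claim above.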
Feb 1, 2024 · In this blog, we provide examples of how to get started with PatchTST. We first demonstrate the forecasting capability of PatchTST on the ...
Jul 19, 2024 · PatchTST is a transformer-based model for time series modeling tasks, including forecasting, regression, and classification.
Our channel-independent patch time series Transformer (PatchTST) can improve the long-term forecasting accuracy significantly when compared with that of SOTA ...
For the official pretrained PatchTST checkpoints, see https://huggingface.co/ibm-granite/granite-timeseries-patchtst.
The configuration class instantiates a PatchTST model according to the specified arguments, defining the model architecture; the defaults yield a configuration similar to the [ibm/patchtst](https://huggingface.co/ibm/patchtst) architecture.