Search Results (1)

Search Parameters:
Keywords = cross-timestep feature sharing

21 pages, 2090 KiB  
Article
Multi-Task Time Series Forecasting Based on Graph Neural Networks
by Xiao Han, Yongjie Huang, Zhisong Pan, Wei Li, Yahao Hu and Gengyou Lin
Entropy 2023, 25(8), 1136; https://doi.org/10.3390/e25081136 - 28 Jul 2023
Cited by 2 | Viewed by 2582
Abstract
Accurate time series forecasting is of great importance in real-world scenarios such as health care, transportation, and finance. Because of the trend, temporal variation, and periodicity of time series data, there are complex and dynamic dependencies among its underlying features. In time series forecasting tasks, the features learned by a specific task at the current timestep (such as predicting mortality) are related both to the features of historical timesteps and to the features of adjacent timesteps of related tasks (such as predicting fever). Capturing these dynamic dependencies in the data is therefore a challenging problem for learning accurate future predictions. To address this challenge, we propose a cross-timestep feature-sharing multi-task time series forecasting model that can capture global and local dynamic dependencies in time series data. First, the global dynamic dependencies of features within each task are captured through a self-attention mechanism. Next, an adaptive sparse graph structure is employed to capture the local dynamic dependencies inherent in the data, which explicitly depicts the correlation between features across timesteps and tasks. Finally, cross-timestep feature sharing between tasks is achieved through a graph attention mechanism, which strengthens the learning of shared features that are strongly correlated with a single task and thereby improves the generalization performance of the model. Our experimental results demonstrate that our method compares favorably against baseline methods.
(This article belongs to the Special Issue Machine Learning in Biomedical Data Analysis)
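
The abstract describes cross-timestep feature sharing realized through graph attention over features drawn from different tasks and timesteps. Below is a minimal, hypothetical PyTorch sketch of such a graph attention layer, assuming each node holds the feature vector of one (task, timestep) pair and a binary adjacency mask `adj` stands in for the paper's adaptive sparse graph; the class name, node layout, and single-head design are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: single-head GAT-style attention over (task, timestep) nodes.
# `adj` is a binary mask standing in for an adaptive sparse graph (assumption).
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossTimestepGraphAttention(nn.Module):
    """Aggregates shared features across tasks and timesteps via graph attention."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        # Scores an edge from the concatenated target/source node features.
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (num_nodes, in_dim)      one node per (task, timestep) pair
        # adj: (num_nodes, num_nodes)   adj[i, j] = 1 keeps the edge j -> i
        h = self.proj(x)                                     # (N, out_dim)
        n = h.size(0)
        # Pairwise concatenation of target (i) and source (j) projections.
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1), h.unsqueeze(0).expand(n, n, -1)],
            dim=-1,
        )                                                    # (N, N, 2*out_dim)
        scores = F.leaky_relu(self.attn(pairs).squeeze(-1))  # (N, N)
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)                # per-target attention
        return alpha @ h                                     # aggregated shared features


# Toy usage: 2 tasks x 4 timesteps = 8 nodes with 16-dim features.
if __name__ == "__main__":
    x = torch.randn(8, 16)
    adj = (torch.rand(8, 8) > 0.5).float()
    adj.fill_diagonal_(1.0)                                  # keep self-loops
    layer = CrossTimestepGraphAttention(16, 32)
    print(layer(x, adj).shape)                               # torch.Size([8, 32])
```

In the paper's setting the graph would be learned adaptively and kept sparse rather than sampled at random, and a multi-head variant would be the more typical choice; see the full article for the actual architecture.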