Improving long-horizon forecasts with expectation-biased LSTM networks

AA Ismail, T Wood, HC Bravo - arXiv preprint arXiv:1804.06776, 2018 - arxiv.org
State-of-the-art forecasting methods using Recurrent Neural Networks (RNN) based on Long Short-Term Memory (LSTM) cells have shown exceptional performance on short-horizon forecasts, e.g., given a set of predictor features, forecasting a target value for the next few time steps. However, in many applications, the performance of these methods decays as the forecasting horizon extends beyond these few time steps. This paper explores the challenges of long-horizon forecasting using LSTM networks. We illustrate the long-horizon forecasting problem on datasets from neuroscience and energy supply management. We then propose expectation-biasing, an approach motivated by the literature on Dynamic Belief Networks, as a solution to improve long-horizon forecasting using LSTMs. We propose two LSTM architectures along with two methods for expectation biasing that significantly outperform standard practice.
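The abstract does not detail the biasing mechanism itself. As one illustrative (hypothetical) interpretation of why biasing toward an expectation can help: in recursive multi-step forecasting, each prediction is fed back as input, so small one-step errors compound and the rollout drifts. The sketch below, which is an assumption rather than the paper's actual method, blends each raw one-step prediction with a prior expectation before feeding it back, keeping the long-horizon rollout bounded; `step_fn`, `biased_rollout`, and `alpha` are names invented here for illustration.

```python
import numpy as np

def biased_rollout(step_fn, history, horizon, expectation, alpha=0.5):
    """Recursive multi-step forecast with a simple expectation bias.

    Each raw one-step prediction is blended with a prior expectation
    (hypothetical reading of 'expectation biasing'); alpha weights the
    model's prediction, (1 - alpha) weights the expectation.
    """
    preds = []
    window = list(history)
    for _ in range(horizon):
        raw = step_fn(window)                       # one-step model forecast
        biased = alpha * raw + (1 - alpha) * expectation
        preds.append(biased)
        window = window[1:] + [biased]              # feed prediction back in
    return np.array(preds)

# Toy stand-in for a trained model: slightly amplifies the last value,
# so an *unbiased* recursive rollout would drift upward exponentially.
step = lambda w: 1.05 * w[-1]
forecast = biased_rollout(step, [1.0, 1.0, 1.0], horizon=50, expectation=1.0)
```

With `alpha=0.5` the biased rollout converges toward a fixed point near the expectation instead of drifting, whereas the unbiased rollout of the same toy model grows without bound over 50 steps.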