DOI: 10.1007/978-3-319-70139-4_54
Article

Position-Based Content Attention for Time Series Forecasting with Sequence-to-Sequence RNNs

Published: 14 November 2017
  Abstract

    We propose here an extended attention model for sequence-to-sequence recurrent neural networks (RNNs) designed to capture (pseudo-)periods in time series. This extended attention model can be deployed on top of any RNN and is shown to yield state-of-the-art performance for time series forecasting on several univariate and multivariate time series.
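
    The abstract describes the model only at a high level. The sketch below is an illustrative PyTorch rendering of the general idea, not the authors' implementation: a standard additive content-attention score is augmented with a learned term over the relative position (lag) between each encoder step and the current forecast step, so encoder states lying roughly one (pseudo-)period back can receive extra weight. The class name `PositionAwareAttention`, the per-lag bias parameter, and all tensor shapes are assumptions made for illustration.

```python
# Illustrative sketch only (assumption): additive content attention plus a
# learned bias over the lag between each encoder step and the decoder step,
# so positions about one (pseudo-)period back can be favoured.
import torch
import torch.nn as nn


class PositionAwareAttention(nn.Module):
    def __init__(self, hidden_size: int, max_len: int):
        super().__init__()
        self.w_enc = nn.Linear(hidden_size, hidden_size, bias=False)
        self.w_dec = nn.Linear(hidden_size, hidden_size, bias=False)
        self.v = nn.Linear(hidden_size, 1, bias=False)
        # One learnable bias per possible lag (hypothetical position term).
        self.position_bias = nn.Parameter(torch.zeros(2 * max_len))

    def forward(self, enc_out, dec_state, dec_step):
        # enc_out: (batch, src_len, hidden); dec_state: (batch, hidden)
        src_len = enc_out.size(1)
        # Standard additive (Bahdanau-style) content score.
        content = self.v(torch.tanh(
            self.w_enc(enc_out) + self.w_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                                    # (batch, src_len)
        # Lag of each encoder step relative to the current forecast step.
        lags = dec_step + src_len - torch.arange(src_len)
        scores = content + self.position_bias[lags]       # content + position
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights.unsqueeze(1), enc_out).squeeze(1)
        return context, weights


# Toy usage: 8 series, one week of hourly encoder states, 64-dim hidden state.
attn = PositionAwareAttention(hidden_size=64, max_len=200)
context, weights = attn(torch.randn(8, 168, 64), torch.randn(8, 64), dec_step=0)
```

    In the paper the positional term is tied to detected (pseudo-)periods rather than to a free bias per lag; the sketch only shows where such a term would enter the attention scoring.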


    Published In

    Neural Information Processing: 24th International Conference, ICONIP 2017, Guangzhou, China, November 14–18, 2017, Proceedings, Part V
    Nov 2017
    942 pages
    ISBN: 978-3-319-70138-7
    DOI: 10.1007/978-3-319-70139-4
    • Editors:
    • Derong Liu,
    • Shengli Xie,
    • Yuanqing Li,
    • Dongbin Zhao,
    • El-Sayed M. El-Alfy

    Publisher

    Springer-Verlag

    Berlin, Heidelberg

    Publication History

    Published: 14 November 2017

    Author Tags

    1. Recurrent neural networks
    2. Attention model
    3. Time series

    Qualifiers

    • Article
