
Sequence Prediction Using Spectral RNNs

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2020 (ICANN 2020)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12396)

Abstract

Fourier methods have a long and proven track record as an excellent tool in data processing. As memory and computational constraints gain importance in embedded and mobile applications, we propose to combine Fourier methods with recurrent neural network architectures. The short-time Fourier transform allows us to efficiently process multiple samples at a time. Additionally, weight reduction through low-pass filtering is possible. We predict time series data drawn from the chaotic Mackey-Glass differential equation as well as real-world power-load and motion-capture data. (Source code available at https://github.com/v0lta/Spectral-RNN).
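To make the idea concrete, the following minimal PyTorch sketch (illustrative only; it is not the authors' implementation, which is available at the repository linked above) applies a short-time Fourier transform to a batch of signals, keeps only the lowest-frequency bins as a crude low-pass reduction, and trains a GRU to predict the next spectral frame. All hyper-parameters, the GRU/linear layer choices, and the toy sine-wave data are assumptions made for illustration.

    import math
    import torch
    import torch.nn as nn

    n_fft, hop, keep_bins = 64, 32, 16   # keep_bins < n_fft//2 + 1 acts as a low-pass cut

    class SpectralPredictor(nn.Module):
        """Toy RNN that predicts the next STFT frame from past frames."""
        def __init__(self, hidden=128):
            super().__init__()
            self.rnn = nn.GRU(input_size=2 * keep_bins, hidden_size=hidden, batch_first=True)
            self.out = nn.Linear(hidden, 2 * keep_bins)   # real + imaginary parts of next frame

        def forward(self, frames):                        # frames: (batch, time, 2*keep_bins)
            h, _ = self.rnn(frames)
            return self.out(h)

    def to_frames(signal):
        """STFT -> low-pass (keep lowest bins) -> stacked real/imag features."""
        window = torch.hann_window(n_fft)
        spec = torch.stft(signal, n_fft, hop_length=hop, window=window, return_complex=True)
        spec = spec[:, :keep_bins, :]                     # dropping high bins = fewer values to model
        feats = torch.cat([spec.real, spec.imag], dim=1)  # (batch, 2*keep_bins, frames)
        return feats.transpose(1, 2)                      # (batch, frames, 2*keep_bins)

    # toy usage: one gradient step of next-frame prediction on a noisy sine wave
    t = torch.linspace(0, 8 * math.pi, 2048)
    signal = torch.sin(t).repeat(4, 1) + 0.05 * torch.randn(4, 2048)
    frames = to_frames(signal)
    model = SpectralPredictor()
    pred = model(frames[:, :-1, :])                       # predict frame k+1 from frames 0..k
    loss = nn.functional.mse_loss(pred, frames[:, 1:, :])
    loss.backward()
    print(frames.shape, pred.shape, float(loss))

Because each STFT frame covers many raw samples, the recurrent network takes far fewer steps per sequence than a sample-level RNN, which is the efficiency argument made in the abstract.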

Notes

  1. We adopt the common strategy of adding a small tolerance \(\epsilon = 0.001\).
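Such a tolerance is typically added to a denominator so that the computation stays finite where the denominator could be exactly zero (for instance when normalising overlapping analysis windows during inverse STFT synthesis). The snippet below is only a generic illustration of that pattern; where exactly the tolerance enters the authors' pipeline is not reproduced here.

    import torch

    eps = 1e-3                                  # the small tolerance mentioned in the note
    window_energy = torch.zeros(8)              # a denominator that may be exactly zero
    overlap_sum = torch.randn(8)
    safe = overlap_sum / (window_energy + eps)  # remains finite everywhere
    print(torch.isfinite(safe).all())           # tensor(True)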

Acknowledgements

This work has been funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under grants YA 447/2-1 and GA 1927/4-1 (FOR2535 Anticipating Human Behavior), as well as by the National Research Foundation of Singapore under its NRF Fellowship Programme [NRF-NRFFAI1-2019-0001].

Author information

Corresponding author

Correspondence to Moritz Wolter.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Wolter, M., Gall, J., Yao, A. (2020). Sequence Prediction Using Spectral RNNs. In: Farkaš, I., Masulli, P., Wermter, S. (eds.) Artificial Neural Networks and Machine Learning – ICANN 2020. ICANN 2020. Lecture Notes in Computer Science, vol. 12396. Springer, Cham. https://doi.org/10.1007/978-3-030-61609-0_65

  • DOI: https://doi.org/10.1007/978-3-030-61609-0_65

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-61608-3

  • Online ISBN: 978-3-030-61609-0

  • eBook Packages: Computer Science (R0)
