DOI: 10.1145/3320288.3320303
NICE Conference Proceedings · Research article

Analysis of Wide and Deep Echo State Networks for Multiscale Spatiotemporal Time Series Forecasting

Published: 26 March 2019

Abstract

Echo state networks are computationally lightweight reservoir models inspired by the random projections observed in cortical circuitry. As interest in reservoir computing has grown, networks have become deeper and more intricate. While these networks are increasingly applied to nontrivial forecasting tasks, a comprehensive performance analysis of deep reservoirs is still lacking. In this work, we study how partitioning a fixed budget of neurons among reservoirs influences performance, and the effect of parallel reservoir pathways, across datasets exhibiting multiscale and nonlinear dynamics.
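As context for the abstract, below is a minimal NumPy sketch of a single (shallow) echo state network with a ridge-regression readout. It is illustrative only, not the wide/deep architecture the paper analyzes; the hyperparameters (100 reservoir neurons, spectral radius 0.9, 10% connectivity, ridge coefficient 1e-6) and the toy sine forecasting task are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, density=0.1):
    # Random input weights and sparse recurrent weights, rescaled so the
    # largest eigenvalue magnitude of W equals the target spectral radius.
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rng.random((n_res, n_res)) < density          # sparsify
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(u, W_in, W, leak=1.0):
    # Collect reservoir states for an input sequence u of shape (T, n_in).
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# One-step-ahead forecasting on a toy sine series: only the linear
# readout W_out is trained, via ridge regression on the reservoir states.
T = 500
u = np.sin(0.2 * np.arange(T))[:, None]
target = np.roll(u[:, 0], -1)[:-1]                     # next-step targets
W_in, W = make_reservoir(1, 100)
X = run_reservoir(u, W_in, W)[:-1]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ target)
pred = X @ W_out
print("train MSE:", np.mean((pred - target) ** 2))
```

A "wide" variant would run several such reservoirs in parallel on the same input and concatenate their states before the readout; a "deep" variant would feed one reservoir's state sequence in as the next reservoir's input.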


Cited By

  • (2022) Deep Reservoir Computing Based Random Vector Functional Link for Non-sequential Classification. 2022 International Joint Conference on Neural Networks (IJCNN), 1--8. DOI: 10.1109/IJCNN55064.2022.9892228. Published 18 July 2022.
  • (2020) Deep Echo State Networks with Multi-Span Features for Nonlinear Time Series Prediction. 2020 International Joint Conference on Neural Networks (IJCNN), 1--9. DOI: 10.1109/IJCNN48605.2020.9207401. Published July 2020.
  • (2019) Stochastic Tucker-Decomposed Recurrent Neural Networks for Forecasting. 2019 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 1--5. DOI: 10.1109/GlobalSIP45357.2019.8969554. Published November 2019.

Published In

NICE '19: Proceedings of the 7th Annual Neuro-inspired Computational Elements Workshop
March 2019
109 pages
ISBN:9781450361231
DOI:10.1145/3320288
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

In-Cooperation

  • INTEL: Intel Corporation
  • IBM: IBM, USA

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Echo state networks (ESNs)
  2. recurrent neural networks
  3. reservoir computing
  4. time series forecasting

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

NICE '19

Acceptance Rates

NICE '19 Paper Acceptance Rate: 25 of 40 submissions (63%)
Overall Acceptance Rate: 25 of 40 submissions (63%)

Article Metrics

  • Downloads (last 12 months): 8
  • Downloads (last 6 weeks): 1
Reflects downloads up to 17 Oct 2024
