
Time-Series Forecasting: No, LSTMs Are Not Dead!

If they are dead, why do they still win Kaggle Competitions?

Nikos Kafritsas
Towards Data Science


Photo by Ricardo L on Unsplash

Who has been closely following the machine learning field during the past decade?

Those who have witnessed revolutionary scientific progress like no other. It is reminiscent of the beginning of the 20th century, when Einstein's Annus mirabilis papers laid the foundations of modern physics. Only this time, the spark was the AlexNet paper [1], an architecture that revolutionized computer vision and renewed interest in machine learning (a field soon rebranded as Deep Learning).

The caveat of this relentless growth is that it is difficult to assess every breakthrough correctly: before a new technique is introduced and starts gaining ground, another one emerges that is more powerful, faster, or cheaper. This rapid progress generates so much hype that it attracts many newcomers, often with great enthusiasm but little experience.

One such misunderstood breakthrough in the field of Deep Learning is the family of recurrent neural networks. If you Google phrases such as "LSTMs are dead" and "RNNs have died," you will find countless results, most of which are incorrect or do not give the full picture. This article shows that recurrent networks are still relevant and find use in many practical…
