This paper introduces two new concepts that could help resurrect it: object representation by continuous attractors, and learning attractors by ...
To perform the task of filling in missing information, the network develops a continuous attractor that models the manifold from which the patterns are drawn.
Recurrent neural networks (RNNs) may possess continuous attractors, a property that many brain theories have implicated in learning and memory.
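As a hedged illustration of what "continuous attractor" means here (not any specific paper's construction): in a linear recurrent network whose weight matrix has an eigenvalue exactly 1, the corresponding eigendirection is an entire line of fixed points, so the network can hold a continuum of states rather than a discrete set of memories. The toy weights below are chosen by hand.

```python
import numpy as np

# Sketch of a continuous (line) attractor in a linear recurrent network.
# W has eigenvalue 1 along direction v and 0 elsewhere, so every point on
# the line spanned by v is a fixed point of x <- W x.
v = np.array([1.0, 1.0]) / np.sqrt(2)   # direction of the line attractor
W = np.outer(v, v)                      # eigenvalue 1 along v, 0 orthogonal

def run(x, steps=50):
    """Iterate the linear dynamics x <- W x."""
    for _ in range(steps):
        x = W @ x
    return x

# Any initial state collapses onto the line; its component along v
# (here sqrt(2)) is preserved indefinitely — a continuum of memories.
x0 = np.array([2.0, 0.0])
x_final = run(x0)                       # lands on the line at (1, 1)
```

The same projection-and-persistence behaviour is what the nonlinear networks in these papers approximate along a learned manifold.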
Apr 3, 2024 · Here, we study how recurrent networks of binary neurons learn sequence attractors to store predefined pattern sequences and retrieve them robustly.
Representations of continuous attractors of recurrent neural networks · Continuous attractors of Lotka-Volterra recurrent neural networks with infinite neurons.
Here we adopt a new method for identifying the fixed points (both stored and false memory patterns) learned by attractor networks in general.
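The snippet does not give the paper's actual method, but a generic baseline for enumerating fixed points (stored and spurious alike) in a small attractor network is to iterate its dynamics from many random initial conditions and collect the distinct states it settles into. The network, gain, and tolerances below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: enumerate fixed points of x = tanh(W x) by damped
# iteration from random starts. Symmetric weights keep the dynamics
# convergent; damping (the 0.5 average) suppresses period-2 flipping.
rng = np.random.default_rng(1)
N = 4
W = rng.normal(scale=1.5 / np.sqrt(N), size=(N, N))
W = (W + W.T) / 2                        # symmetric weights

def settle(x, steps=2000, tol=1e-10):
    """Damped iteration x <- (x + tanh(W x)) / 2; returns (state, converged)."""
    for _ in range(steps):
        x_new = 0.5 * x + 0.5 * np.tanh(W @ x)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, True
        x = x_new
    return x, False

fixed_points = []
for _ in range(200):
    x, ok = settle(rng.uniform(-1, 1, size=N))
    # Keep only converged, previously unseen states.
    if ok and all(np.max(np.abs(x - fp)) > 1e-4 for fp in fixed_points):
        fixed_points.append(x)
```

Every state this collects satisfies x = tanh(W x) to within tolerance; distinguishing which of them are stored memories versus false memories is the harder problem the snippet's method addresses.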
Dec 1, 1997 · To perform the task of filling in missing information, the network develops a continuous attractor that models the manifold from which the patterns are drawn.
Recurrent networks can be used as associative memories where the stored memories represent fixed points to which the dynamics of the network converges.
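A minimal sketch of this idea is the standard Hopfield-style associative memory, where Hebbian weights make the stored patterns fixed points of the recurrent dynamics. Sizes, the number of patterns, and the corruption level below are illustrative choices, not taken from any of the papers above.

```python
import numpy as np

# Hopfield-style associative memory: stored memories become fixed points
# to which the dynamics converges from nearby (corrupted) states.
rng = np.random.default_rng(0)
N = 64
patterns = rng.choice([-1, 1], size=(3, N))   # three random binary memories

# Hebbian outer-product weights; no self-connections.
W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)

def converge(state, sweeps=20):
    """Asynchronous sign updates until no neuron changes (a fixed point)."""
    state = state.copy()
    for _ in range(sweeps):
        changed = False
        for i in range(N):
            s = 1 if W[i] @ state >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:
            break
    return state

# Corrupt a stored memory in 5 places, then let the dynamics fall back.
noisy = patterns[0].copy()
noisy[rng.choice(N, size=5, replace=False)] *= -1
recovered = converge(noisy)
```

With asynchronous updates and symmetric weights the energy function decreases monotonically, which is why convergence to some fixed point is guaranteed; at this low memory load the recovered state closely matches the stored pattern.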
Apr 5, 2022 · We developed a theory for manifold attractors in trained neural networks, which approximates a continuum of persistent states, without assuming unrealistic ...