The theorem shows that any nonlinear functional that can be stably approximated by general nonlinear RNNs must have an exponentially decaying memory, confirming that the curse-of-memory phenomenon is not limited to the linear case. Numerical verifications are included to demonstrate the result.
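One schematic way to read the decaying-memory condition (a sketch of the usual formulation; the paper's precise definition, e.g. its uniformity in t and the admissible input class, may differ): for a family of target functionals {H_t} acting on bounded input sequences x,

    sup_{\|x\|_\infty \le 1} | H_t(x) - H_t(x \cdot 1_{[t-s,\, t]}) | \le C e^{-\beta s}   for some C, \beta > 0,

i.e. zeroing out all inputs older than s steps changes the output by at most an exponentially small amount. Stable approximability by an RNN forces this kind of decay on the target.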
May 30, 2023 · Abstract: We prove an inverse approximation theorem for the approximation of nonlinear sequence-to-sequence relationships using recurrent neural networks (RNNs).
We study the approximation properties and optimization dynamics of recurrent neural networks (RNNs) when applied to learn input-output relationships in temporal ...
S. Wang, Z. Li, Q. Li. Inverse approximation theory for nonlinear recurrent neural networks. International Conference on Learning Representations (ICLR), Spotlight.
Although recurrent models benefit from low inference costs, the curse of memory restricts their effectiveness for tasks involving long sequences. In this paper, we study ...
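A toy numerical sketch of how this plays out (this is not the paper's experiment; the network size, weight scale, and the helper run below are illustrative assumptions):

    import numpy as np

    # Toy check of the curse-of-memory idea: perturb the input at time 0
    # and see how much a random tanh RNN's hidden state still reflects it
    # after t steps. All sizes and scales are illustrative.
    rng = np.random.default_rng(0)
    d = 32                                                # hidden width
    W = rng.normal(scale=0.9 / np.sqrt(d), size=(d, d))   # recurrent weights, contractive on average
    U = rng.normal(size=d)                                # input weights

    def run(x_seq):
        h = np.zeros(d)
        for x in x_seq:
            h = np.tanh(W @ h + U * x)
        return h

    base = np.zeros(60)
    pert = base.copy()
    pert[0] = 1.0                                         # perturb only the very first input
    for t in (5, 10, 20, 40, 60):
        gap = np.linalg.norm(run(pert[:t]) - run(base[:t]))
        print(f"t = {t:2d}   |h_pert - h_base| = {gap:.3e}")

The printed gap typically shrinks roughly geometrically in t, mirroring the exponentially decaying memory that the theorem says any stably RNN-approximable target must have.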
Jun 22, 2024 · In the present paper we study the best approximation order and prove inverse approximation theorems for families of ...