Dec 26, 2020 · We prove that variance reduction reduces the SFO complexity of adaptive mirror descent algorithms and thus accelerates their convergence.
Aug 22, 2022 · We prove that variance reduction can reduce the gradient complexity of all adaptive mirror descent algorithms that satisfy a mild assumption and thus accelerate their convergence.
We study the application of the variance reduction technique on general adaptive stochastic mirror descent algorithms in nonsmooth nonconvex finite-sum optimization problems.
We propose a simple yet generalized framework for variance reduced adaptive mirror descent algorithms named SVRAMD and provide its convergence analysis.
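The snippets describe plugging SVRG-style variance reduction into stochastic mirror descent. A minimal illustrative sketch of that idea, not the paper's SVRAMD algorithm: it uses a hypothetical synthetic least-squares finite sum and the Euclidean mirror map (under which the Bregman proximal step reduces to a plain gradient step); all names, sizes, and step sizes here are assumptions for the example.

```python
import numpy as np

# Hypothetical finite-sum objective: F(x) = (1/n) * sum_i (a_i . x - b_i)^2
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_i(x, i):
    # Gradient of the i-th component function
    return 2.0 * (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    # Full-batch gradient, recomputed once per epoch at the snapshot
    return (2.0 / n) * A.T @ (A @ x - b)

def svr_mirror_descent(epochs=20, m=100, lr=0.01):
    x = np.zeros(d)
    for _ in range(epochs):
        x_snap = x.copy()
        mu = full_grad(x_snap)
        for _ in range(m):
            i = rng.integers(n)
            # SVRG variance-reduced gradient estimator:
            # unbiased, with variance shrinking as x approaches x_snap
            g = grad_i(x, i) - grad_i(x_snap, i) + mu
            # Mirror descent step; with the Euclidean mirror map 0.5*||x||^2
            # the Bregman prox is a plain gradient step.  A non-Euclidean
            # map (e.g. negative entropy) would replace this line with the
            # corresponding Bregman proximal update.
            x = x - lr * g
    return x

x_hat = svr_mirror_descent()
```

With a non-smooth regularizer or a non-Euclidean geometry, only the last update line changes; the variance-reduced estimator `g` stays the same.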
We study the convergence of stochastic mirror descent and make explicit the tradeoffs between communication and variance reduction.