Jan 11, 2023 · In this work we propose an adaptive proximal gradient method, adaPG, that uses novel estimates of the local smoothness modulus which leads to ...
Mar 13, 2024 · In this work we explore an alternative approach which can cope with nonsmooth formulations and does not require any backtracking procedures or ...
An adaptive proximal gradient method is proposed that uses novel estimates of the local smoothness modulus, which leads to less conservative stepsize updates ... (Latafat, Themelis, Stella, Patrinos, “Adaptive proximal algorithms for convex optimization under local Lipschitz continuity of the gradient”, arXiv:2301.04431).
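The entries above center on stepsizes built from local smoothness estimates rather than a global Lipschitz constant. As a hedged illustration of the general form (not the exact adaPG update rule of arXiv:2301.04431), the local modulus and the proximal gradient step can be written as

\[
  L_k \;=\; \frac{\lVert \nabla f(x^k) - \nabla f(x^{k-1}) \rVert}{\lVert x^k - x^{k-1} \rVert},
  \qquad
  x^{k+1} \;=\; \operatorname{prox}_{\gamma_k g}\!\bigl(x^k - \gamma_k \nabla f(x^k)\bigr),
\]

where f is the smooth part, g the nonsmooth part, and the stepsize \gamma_k is chosen from L_k (with the safeguards specified in the respective papers) instead of from a global Lipschitz constant of \nabla f.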
In this paper, we explore two fundamental first-order algorithms in convex optimization, namely gradient descent (GD) and the proximal gradient method (ProxGD) ...
The paper proposes adaptive versions of GD and ProxGD that are based on observed gradient differences and thus have no added computational costs and ...
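To make "adaptive stepsizes from observed gradient differences" concrete, below is a minimal proximal gradient sketch in Python whose stepsize is driven by the ratio of successive gradient and iterate differences. The helper names (adaptive_prox_grad, prox_l1), the safeguard factors (2.0 and 0.9), and the small LASSO-type example are illustrative assumptions, not the precise update rules or experiments of the papers listed here.

import numpy as np

def prox_l1(z, t):
    # Soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_prox_grad(grad_f, prox_g, x0, gamma0=1.0, iters=200):
    # Proximal gradient iterations whose stepsize is driven by the observed
    # local smoothness ||grad_f(x_k) - grad_f(x_{k-1})|| / ||x_k - x_{k-1}||.
    # The growth cap (2x) and the 0.9 factor are heuristic safeguards, not
    # the rules analyzed in the referenced papers.
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad_f(x_prev)
    gamma = gamma0
    x = prox_g(x_prev - gamma * g_prev, gamma)
    for _ in range(iters):
        g = grad_f(x)
        dx, dg = x - x_prev, g - g_prev
        norm_dx = np.linalg.norm(dx)
        if norm_dx > 0.0:
            L_local = np.linalg.norm(dg) / norm_dx   # observed local smoothness
            gamma = min(2.0 * gamma, 0.9 / max(L_local, 1e-12))
        x_prev, g_prev = x, g
        x = prox_g(x - gamma * g, gamma)
    return x

# Illustrative use on a small LASSO-type problem 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.normal(size=(40, 20))
b = rng.normal(size=40)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda z, t: prox_l1(z, lam * t)
x_hat = adaptive_prox_grad(grad_f, prox_g, np.zeros(20))

The design intent mirrors the snippets above: the stepsize reacts to locally observed curvature, so neither a backtracking line search nor a global Lipschitz constant is needed; only gradient differences already computed by the iteration are reused.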
Jul 6, 2022 · We investigate an adaptive scheme for PANOC-type methods (Stella et al. in Proceedings of the IEEE 56th CDC, 2017), namely accelerated ...
In this paper we develop accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient (LLCG), which is beyond ...
We show that adaptive proximal gradient methods for convex problems are not restricted to traditional Lipschitzian assumptions.
May 10, 2023 · Patrinos, “Adaptive proximal algorithms for convex optimization under local Lipschitz continuity of the gradient”, ...