- research-article, January 2022
Accelerating adaptive cubic regularization of Newton's method via random sampling
The Journal of Machine Learning Research (JMLR), Volume 23, Issue 1, Article No. 90, Pages 3904–3941.
In this paper, we consider an unconstrained optimization model where the objective is a sum of a large number of possibly nonconvex functions, though overall the objective is assumed to be smooth and convex. Our approach to solving such a model uses the ...
- research-article, January 2022
Solving the Cubic Regularization Model by a Nested Restarting Lanczos Method
SIAM Journal on Matrix Analysis and Applications (SIMAX), Volume 43, Issue 2, Pages 812–839. https://doi.org/10.1137/21M1436324
As a variant of the classical trust-region method for unconstrained optimization, the cubic regularization of the Newton method introduces a cubic regularization term in the surrogate objective to adaptively adjust the updating step and deals with cases ...
- research-article, January 2020
First-Order Methods for Nonconvex Quadratic Minimization
We consider minimization of indefinite quadratics with either trust-region (norm) constraints or cubic regularization. Despite the nonconvexity of these problems, we prove that, under mild assumptions, gradient descent converges to their global solutions and ...
- research-article, January 2019
On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
SIAM Journal on Optimization (SIOPT), Volume 29, Issue 1, Pages 904–932. https://doi.org/10.1137/18M1167498
In this paper we consider the cubic regularization (CR) method, a regularized version of the classical Newton method, for minimizing a twice continuously differentiable function. While it is well known that the CR method is globally convergent and enjoys ...
- research-article, January 2019
Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
SIAM Journal on Optimization (SIOPT), Volume 29, Issue 3, Pages 2146–2178. https://doi.org/10.1137/17M1113898
We consider the minimization of a nonconvex quadratic form regularized by a cubic term, which may exhibit saddle points and a suboptimal local minimum. Nonetheless, we prove that, under mild assumptions, gradient descent approximates the global minimum to ...
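The entry above studies plain gradient descent on the cubic-regularized Newton subproblem, f(x) = ½ xᵀAx + bᵀx + (ρ/3)‖x‖³. A minimal numerical sketch of that setup follows; the matrix A, vector b, regularization weight rho, step size, and iteration count are illustrative choices, not values from the paper:

```python
import numpy as np

# Toy instance of the cubic-regularized subproblem:
#   f(x) = 0.5 * x^T A x + b^T x + (rho / 3) * ||x||^3
A = np.diag([-1.0, 0.5, 2.0])      # indefinite quadratic term
b = np.array([1.0, -0.5, 0.25])    # linear term, with weight on the
                                   # negative-curvature direction e1
rho = 1.0                          # cubic regularization weight

def grad(x):
    """Gradient of f: A x + b + rho * ||x|| * x."""
    return A @ x + b + rho * np.linalg.norm(x) * x

# Plain gradient descent from the origin with a fixed step size.
x = np.zeros(3)
eta = 0.05
for _ in range(5000):
    x = x - eta * grad(x)

# At a stationary point (A + rho*||x|| I) x = -b holds and the
# gradient vanishes; the norm below should be near zero.
print(np.linalg.norm(grad(x)))
```

Despite the indefinite A, the iterates settle at a stationary point of the regularized objective; the paper's contribution is proving that, under mild assumptions, this limit is the global minimizer.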
- article, September 2017
An affine covariant composite step method for optimization with PDEs as equality constraints
Optimization Methods & Software (OPMS), Volume 32, Issue 5, Pages 1132–1161. https://doi.org/10.1080/10556788.2016.1241783
We propose a composite step method designed for equality-constrained optimization with partial differential equations. The focus is on the construction of a globalization scheme, which is based on cubic regularization of the objective and an affine ...
- article, September 2016
A cubic regularization algorithm for unconstrained optimization using line search and nonmonotone techniques
Optimization Methods & Software (OPMS), Volume 31, Issue 5, Pages 1008–1035. https://doi.org/10.1080/10556788.2016.1155213
In recent years, cubic regularization algorithms for unconstrained optimization have been defined as alternatives to trust-region and line search schemes. These regularization techniques are based on the strategy of computing an approximate global ...
- article, September 2010
On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
SIAM Journal on Optimization (SIOPT), Volume 20, Issue 6, Pages 2833–2852. https://doi.org/10.1137/090774100
It is shown that the steepest-descent and Newton's methods for unconstrained nonconvex optimization under standard assumptions may both require a number of iterations and function evaluations arbitrarily close to $O(\epsilon^{-2})$ to drive the norm of ...