
On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems

Published: 01 September 2010

Abstract

It is shown that the steepest-descent and Newton's methods for unconstrained nonconvex optimization may, under standard assumptions, both require a number of iterations and function evaluations arbitrarily close to $O(\epsilon^{-2})$ to drive the norm of the gradient below $\epsilon$. This shows that the known upper bound of $O(\epsilon^{-2})$ evaluations for the steepest-descent method is tight, and that Newton's method may be as slow as steepest descent in the worst case. The improved evaluation-complexity bound of $O(\epsilon^{-3/2})$ evaluations known for cubically regularized Newton's methods is also shown to be tight.
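As a concrete, hedged illustration of the quantities in the abstract (not the paper's worst-case construction), the following sketch runs fixed-step steepest descent on a simple nonconvex function and counts the gradient evaluations needed to drive $|\nabla f|$ below $\epsilon$. The function `steepest_descent`, the test function, and the Lipschitz constant `L = 5` are illustrative choices introduced here; the $O(\epsilon^{-2})$ bound is a worst-case guarantee, and benign instances like this one terminate far sooner.

```python
# Hedged sketch only: fixed-step steepest descent with the standard step 1/L,
# counting gradient evaluations until |grad f| <= eps. This is NOT the paper's
# worst-case example; it only illustrates the stopping criterion and evaluation
# count discussed in the abstract.
import math

def steepest_descent(grad, x0, L, eps, max_evals=10**6):
    """Gradient descent with step 1/L; returns (x, gradient evaluations used)."""
    x, step = x0, 1.0 / L
    for k in range(1, max_evals + 1):
        g = grad(x)
        if abs(g) <= eps:  # first-order stopping test: |grad f(x)| <= eps
            return x, k
        x -= step * g
    return x, max_evals

# f(x) = x^2 + 3 sin(x) is nonconvex (f''(x) = 2 - 3 sin(x) changes sign)
# and has gradient Lipschitz constant L = 5.
grad_f = lambda x: 2 * x + 3 * math.cos(x)
x_star, n_evals = steepest_descent(grad_f, x0=4.0, L=5.0, eps=1e-6)
```

On this well-behaved instance the method reaches the tolerance in a handful of evaluations; the paper's point is that adversarial smooth instances can force the count up to nearly $\epsilon^{-2}$.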




      Published In

SIAM Journal on Optimization, Volume 20, Issue 6, August 2010, 822 pages

      Publisher

      Society for Industrial and Applied Mathematics

      United States


      Author Tags

      1. Newton's method
      2. cubic regularization
      3. global complexity bounds
      4. global rate of convergence
      5. nonlinear optimization
      6. steepest-descent method
      7. trust-region methods
      8. unconstrained optimization


      Cited By

• (2025) Perseus: a simple and optimal high-order method for variational inequalities. Mathematical Programming: Series A and B, 209(1), 609–650. doi:10.1007/s10107-024-02075-2
• (2024) Inexact Newton-type methods for optimisation with nonnegativity constraints. Proceedings of the 41st International Conference on Machine Learning, 45835–45882. doi:10.5555/3692070.3693935
• (2024) Challenges in training PINNs. Proceedings of the 41st International Conference on Machine Learning, 42159–42191. doi:10.5555/3692070.3693785
• (2024) A hybrid inexact regularized Newton and negative curvature method. Computational Optimization and Applications, 88(3), 849–870. doi:10.1007/s10589-024-00576-6
• (2024) No dimension-free deterministic algorithm computes approximate stationarities of Lipschitzians. Mathematical Programming: Series A and B, 208(1–2), 51–74. doi:10.1007/s10107-023-02031-6
• (2023) Quantum lower bounds for finding stationary points of nonconvex functions. Proceedings of the 40th International Conference on Machine Learning, 41268–41299. doi:10.5555/3618408.3620138
• (2023) Complexity of block coordinate descent with proximal regularization and applications to Wasserstein CP-dictionary learning. Proceedings of the 40th International Conference on Machine Learning, 18114–18134. doi:10.5555/3618408.3619156
• (2022) A cubic regularization of Newton's method with finite difference Hessian approximations. Numerical Algorithms, 90(2), 607–630. doi:10.1007/s11075-021-01200-y
• (2022) Lower bounds for non-convex stochastic optimization. Mathematical Programming: Series A and B, 199(1–2), 165–214. doi:10.1007/s10107-022-01822-7
• (2021) On the convergence of step decay step-size for stochastic optimization. Proceedings of the 35th International Conference on Neural Information Processing Systems, 14226–14238. doi:10.5555/3540261.3541351
