
Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results

  • Full Length Paper
  • Series A
Mathematical Programming

Abstract

An Adaptive Regularisation algorithm using Cubics (ARC) is proposed for unconstrained optimization, generalizing at the same time an unpublished method due to Griewank (Technical Report NA/12, 1981, DAMTP, University of Cambridge), an algorithm by Nesterov and Polyak (Math Program 108(1):177–205, 2006) and a proposal by Weiser et al. (Optim Methods Softw 22(3):413–431, 2007). At each iteration of our approach, an approximate global minimizer of a local cubic regularisation of the objective function is determined, and this ensures a significant improvement in the objective so long as the Hessian of the objective is locally Lipschitz continuous. The new method uses an adaptive estimation of the local Lipschitz constant and approximations to the global model-minimizer which remain computationally-viable even for large-scale problems. We show that the excellent global and local convergence properties obtained by Nesterov and Polyak are retained, and sometimes extended to a wider class of problems, by our ARC approach. Preliminary numerical experiments with small-scale test problems from the CUTEr set show encouraging performance of the ARC algorithm when compared to a basic trust-region implementation.
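
To make the approach concrete, the sketch below implements a generic adaptive cubic-regularisation loop in Python. At each iteration the cubic model m(s) = f(x) + g^T s + (1/2) s^T H s + (sigma/3)||s||^3 is approximately minimised, the step is accepted when the achieved reduction in f is a sufficient fraction of the reduction predicted by the model, and the weight sigma, the adaptive surrogate for the local Lipschitz constant, is decreased after very successful steps and increased after unsuccessful ones. This is a minimal illustration rather than the paper's Algorithm ARC: the thresholds eta1 and eta2, the factors gamma1 and gamma2, and the crude fixed-point inner solver are assumptions made here for brevity, whereas the paper develops approximations to the global model minimiser that remain computationally viable for large-scale problems.

import numpy as np


def cubic_model_step(g, H, sigma, inner_tol=1e-10, max_inner=100):
    # Crude inner solver: fixed-point iteration on (H + sigma*||s||*I) s = -g,
    # a first-order condition for minimising the cubic model. Purely
    # illustrative; it assumes the shifted systems remain solvable.
    n = len(g)
    s = -np.linalg.solve(H + sigma * np.eye(n), g)
    for _ in range(max_inner):
        s_new = -np.linalg.solve(H + sigma * np.linalg.norm(s) * np.eye(n), g)
        if np.linalg.norm(s_new - s) <= inner_tol:
            return s_new
        s = s_new
    return s


def arc_sketch(f, grad, hess, x0, sigma0=1.0, eta1=0.1, eta2=0.9,
               gamma1=0.5, gamma2=2.0, gtol=1e-6, max_iter=200):
    # Illustrative adaptive cubic-regularisation loop (not the paper's exact ARC).
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) <= gtol:
            break
        s = cubic_model_step(g, H, sigma)
        # Decrease predicted by the cubic model
        #   m(s) = f(x) + g's + 0.5 s'Hs + (sigma/3)||s||^3.
        pred = -(g @ s + 0.5 * s @ (H @ s) + (sigma / 3.0) * np.linalg.norm(s) ** 3)
        rho = (f(x) - f(x + s)) / max(pred, np.finfo(float).tiny)
        if rho >= eta1:            # successful: accept the step
            x = x + s
            if rho >= eta2:        # very successful: relax the regularisation
                sigma = gamma1 * sigma
        else:                      # unsuccessful: reject and regularise more strongly
            sigma = gamma2 * sigma
    return x


if __name__ == "__main__":
    # Smoke test on the 2-D Rosenbrock function.
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                               200 * (x[1] - x[0] ** 2)])
    hess = lambda x: np.array([[2 - 400 * (x[1] - 3 * x[0] ** 2), -400 * x[0]],
                               [-400 * x[0], 200.0]])
    print(arc_sketch(f, grad, hess, np.array([-1.2, 1.0])))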


References

  1. Byrd R.H., Khalfan H.F., Schnabel R.B.: Analysis of a symmetric rank-one trust region method. SIAM J. Optim. 6(4), 1025–1039 (1996)


  2. Cartis, C., Gould, N.I.M., Toint, Ph.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity (2007)

  3. Conn A.R., Gould N.I.M., Toint Ph.L.: Convergence of quasi-Newton matrices generated by the symmetric rank one update. Math. Program. 50(2), 177–196 (1991)


  4. Conn A.R., Gould N.I.M., Toint Ph.L.: Trust-Region Methods. SIAM, Philadelphia (2000)


  5. Dembo R.S., Eisenstat S.C., Steihaug T.: Inexact-Newton methods. SIAM J. Numer. Anal. 19(2), 400–408 (1982)


  6. Dembo R.S., Steihaug T.: Truncated-Newton algorithms for large-scale unconstrained optimization. Math. Program. 26(2), 190–212 (1983)


  7. Dennis J.E., Moré J.J.: A characterization of superlinear convergence and its application to quasi-Newton methods. Math. Comput. 28(126), 549–560 (1974)


  8. Dennis, J.E., Schnabel, R.B.: Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice-Hall, Englewood Cliffs (1983). Reprinted as Classics in Applied Mathematics, vol. 16. SIAM, Philadelphia (1996)

  9. Deuflhard, P.: Newton Methods for Nonlinear Problems. Affine Invariance and Adaptive Algorithms. Springer Series in Computational Mathematics, vol. 35. Springer, Berlin (2004)

  10. Dolan E.D., Moré J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)


  11. Gerasoulis A.: A fast algorithm for the multiplication of generalized Hilbert matrices with vectors. Math. Comput. 50(181), 179–188 (1988)


  12. Gerasoulis A., Grigoriadis M.D., Sun L.: A fast algorithm for Trummer’s problem. SIAM J. Sci. Stat. Comput. 8(1), 135–138 (1987)


  13. Golub G.H., Van Loan C.F.: Matrix Computations. The Johns Hopkins University Press, Baltimore (1996)


  14. Gould N.I.M., Lucidi S., Roma M., Toint Ph.L.: Solving the trust-region subproblem using the Lanczos method. SIAM J. Optim. 9(2), 504–525 (1999)


  15. Gould N.I.M., Orban D., Sartenaer A., Toint Ph.L.: Sensitivity of trust-region algorithms to their parameters. 4OR 3(3), 227–241 (2005)


  16. Gould N.I.M., Orban D., Toint Ph.L.: CUTEr (and SifDec), a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)


  17. Griewank, A.: The modification of Newton’s method for unconstrained optimization by bounding cubic terms. Technical Report NA/12, Department of Applied Mathematics and Theoretical Physics, University of Cambridge (1981)

  18. Griewank A.: The “global” convergence of Broyden-like methods with a suitable line search. J. Aust. Math. Soc. Ser. B 28, 75–92 (1986)


  19. Griewank, A., Toint, Ph.L.: Numerical experiments with partially separable optimization problems. Numerical Analysis: Proceedings Dundee 1983. Lecture Notes in Mathematics vol. 1066, pp. 203–220. Springer, Heidelberg (1984)

  20. Moré, J.J.: Recent developments in algorithms and software for trust region methods. In: Bachem, A., Grötschel, M., Korte, B. (eds.) Mathematical Programming: The State of the Art, pp. 258–287. Springer, Heidelberg (1983)

  21. Nesterov Yu.: Introductory Lectures on Convex Optimization. Kluwer Academic Publishers, Dordrecht (2004)


  22. Nesterov Yu.: Accelerating the cubic regularization of Newton’s method on convex problems. Math. Program. 112(1), 159–181 (2008)


  23. Nesterov Yu., Polyak B.T.: Cubic regularization of Newton’s method and its global performance. Math. Program. 108(1), 177–205 (2006)


  24. Nocedal J., Wright S.J.: Numerical Optimization. Springer, New York (1999)


  25. Thomas, S.W.: Sequential estimation techniques for quasi-Newton algorithms. Ph.D. Thesis, Cornell University, Ithaca (1975)

  26. Weiser M., Deuflhard P., Erdmann B.: Affine conjugate adaptive Newton methods for nonlinear elastomechanics. Optim. Methods Softw. 22(3), 413–431 (2007)



Author information


Corresponding author

Correspondence to Coralia Cartis.

Additional information

This work was supported by the EPSRC grant GR/S42170.


About this article

Cite this article

Cartis, C., Gould, N.I.M. & Toint, P.L. Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results. Math. Program. 127, 245–295 (2011). https://doi.org/10.1007/s10107-009-0286-5

