Abstract
This paper describes a class of optimization methods that interlace iterations of the limited memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited memory matrix, which plays the dual role of preconditioning the inner conjugate gradient iteration in the HFN method and of providing an initial matrix for the L-BFGS iterations. The lengths of the L-BFGS and HFN cycles are adjusted dynamically during the course of the optimization. Numerical experiments indicate that the new algorithms are effective and insensitive to the choice of parameters.
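The abstract's central idea, a single limited-memory store of curvature pairs that both drives L-BFGS steps and preconditions the inner CG solve of an HFN iteration, can be illustrated in miniature. The sketch below is not the authors' implementation: the fixed cycle lengths, memory size, line search, and test problem are all illustrative choices, and the two-loop recursion is the standard one from the L-BFGS literature.

```python
import numpy as np

def two_loop(grad, pairs):
    """Apply the L-BFGS inverse-Hessian approximation to `grad`
    via the standard two-loop recursion over stored (s, y) pairs."""
    q = grad.copy()
    history = []
    for s, y in reversed(pairs):            # newest pair first
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        history.append((a, rho, s, y))
        q -= a * y
    if pairs:                               # scaled initial matrix gamma * I
        s, y = pairs[-1]
        q *= s.dot(y) / y.dot(y)
    for a, rho, s, y in reversed(history):  # oldest pair first
        q += (a - rho * y.dot(q)) * s
    return q

def pcg(hess_vec, rhs, precond, tol=1e-8, maxiter=50):
    """Preconditioned conjugate gradients for H d = rhs (the inner HFN solve)."""
    x = np.zeros_like(rhs)
    r = rhs.copy()
    z = precond(r)
    p = z.copy()
    rz = r.dot(z)
    for _ in range(maxiter):
        Hp = hess_vec(p)
        alpha = rz / p.dot(Hp)
        x += alpha * p
        r -= alpha * Hp
        if np.linalg.norm(r) < tol:
            break
        z = precond(r)
        rz_new = r.dot(z)
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Tiny demo on a strictly convex quadratic f(x) = 0.5 x'Ax - b'x.
rng = np.random.default_rng(0)
Q = rng.standard_normal((6, 6))
A = Q @ Q.T + 6.0 * np.eye(6)
b = rng.standard_normal(6)
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b

x, pairs, m = np.zeros(6), [], 4            # shared memory of m curvature pairs
for _ in range(3):                          # a short L-BFGS cycle
    g = grad(x)
    d = -two_loop(g, pairs)
    t = 1.0                                 # simple Armijo backtracking
    while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d):
        t *= 0.5
    x_new = x + t * d
    pairs = (pairs + [(x_new - x, grad(x_new) - g)])[-m:]
    x = x_new

# One HFN iteration: the same memory now preconditions the inner CG solve.
g = grad(x)
x = x + pcg(lambda v: A @ v, -g, lambda r: two_loop(r, pairs))
print(np.linalg.norm(grad(x)))              # gradient norm at the enriched iterate
```

On this quadratic the Hessian-vector product is exact, so the HFN step recovers the minimizer; the point of the sketch is only the data flow: the (s, y) pairs collected during the L-BFGS cycle are reused, via the same two-loop recursion, as the CG preconditioner.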
Morales, J.L., Nocedal, J. Enriched Methods for Large-Scale Unconstrained Optimization. Computational Optimization and Applications 21, 143–154 (2002). https://doi.org/10.1023/A:1013756631822