
New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds

Published in Mathematical Programming

Abstract

There are many applications that give rise to singly linearly constrained quadratic programs subject to upper and lower bounds. In this paper, a new algorithm based on secant approximation is provided for the case in which the Hessian matrix is diagonal and positive definite. To deal with the general case where the Hessian is not diagonal, a new efficient projected gradient algorithm is proposed. The basic features of the projected gradient algorithm are: 1) a new formula is used for the stepsize; 2) a recently established adaptive non-monotone line search is incorporated; and 3) the optimal stepsize is determined by quadratic interpolation if the non-monotone line search criterion fails to be satisfied. Numerical experiments on large-scale random test problems and on some medium-scale quadratic programs arising in the training of Support Vector Machines demonstrate the usefulness of these algorithms.
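
For the diagonal case, the key idea is that the KKT stationarity condition expresses each variable as a componentwise clipping that depends on a single Lagrange multiplier, so the problem reduces to finding the root of a monotone piecewise-linear function of that multiplier, which a secant iteration handles well. The following Python sketch illustrates this idea under those assumptions; the function and parameter names are illustrative, and the safeguards are deliberately simpler than in the paper's actual algorithm.

```python
import numpy as np

def solve_diag_qp(d, c, a, b, l, u, tol=1e-10, max_iter=100):
    """Illustrative sketch: solve  min 0.5*x'Dx - c'x  s.t.  a'x = b,  l <= x <= u,
    with D = diag(d), d > 0, by a safeguarded secant search on the multiplier.
    Feasibility of the constraint set is assumed."""
    d, c, a, l, u = map(np.asarray, (d, c, a, l, u))

    def x_of(lam):
        # stationarity gives x_i = (c_i - lam*a_i)/d_i, clipped to [l_i, u_i]
        return np.clip((c - lam * a) / d, l, u)

    def r(lam):
        # residual of the linear constraint; monotone non-increasing in lam
        return a @ x_of(lam) - b

    # expand an interval [lo, hi] until it brackets a root of r
    lo, hi = -1.0, 1.0
    while r(lo) < 0:
        lo *= 2.0
    while r(hi) > 0:
        hi *= 2.0

    lam0, lam1 = lo, hi
    r0, r1 = r(lam0), r(lam1)
    for _ in range(max_iter):
        if abs(r1) <= tol:
            break
        # secant step; fall back to bisection if the step is unusable
        denom = r1 - r0
        lam2 = lam1 - r1 * (lam1 - lam0) / denom if denom != 0 else 0.5 * (lo + hi)
        if not (lo < lam2 < hi):
            lam2 = 0.5 * (lo + hi)
        r2 = r(lam2)
        # maintain the bracket so that r(lo) >= 0 >= r(hi)
        if r2 > 0:
            lo = lam2
        else:
            hi = lam2
        lam0, r0, lam1, r1 = lam1, r1, lam2, r2
    return x_of(lam1)
```

As a usage example, setting d to all ones, a to the +1/-1 class labels, l = 0, and u = C gives the kind of projection subproblem that appears inside decomposition methods for training Support Vector Machines.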


Author information

Corresponding author

Correspondence to Yu-Hong Dai.

Additional information

This work was supported by the EPSRC in the UK (grant no. GR/R87208/01) and by Chinese NSF grants (nos. 10171104 and 40233029).


About this article

Cite this article

Dai, YH., Fletcher, R. New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds. Math. Program. 106, 403–421 (2006). https://doi.org/10.1007/s10107-005-0595-2

