Abstract
We study conditions for convergence of a generalized subgradient algorithm in which a relaxation step is taken in a direction that is a convex combination of possibly all previously generated subgradients. A simple condition for convergence is given, and conditions that guarantee a linear rate of convergence are also presented. We show that choosing the steplength parameter and the convex combination of subgradients optimally, in a certain sense, is equivalent to solving a minimum-norm quadratic programming problem. It is also shown that if the direction is restricted to be a convex combination of the current subgradient and the previous direction, then an optimal choice of stepsize and direction is equivalent to the Camerini-Fratta-Maffioli modification of the subgradient method.
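To make the relaxation step and the deflected direction concrete, the following is a minimal sketch in Python. It uses a deflection rule in the style of the Camerini-Fratta-Maffioli modification mentioned above; the test objective f, its subgradient, and the parameters lam and gamma are illustrative assumptions, not the paper's exact scheme, and the general algorithm studied here allows the direction to be a convex combination of all previously generated subgradients.

```python
# Minimal sketch, assuming a known optimal value f_star (as relaxation steps require).
# The objective, subgradient, and parameter values are illustrative choices.
import numpy as np

def f(x):
    # Example nonsmooth convex objective: the infinity norm, minimized at x = 0.
    return np.max(np.abs(x))

def subgrad(x):
    # One subgradient of the infinity norm: sign of a coordinate of maximal magnitude.
    g = np.zeros_like(x)
    i = int(np.argmax(np.abs(x)))
    g[i] = np.sign(x[i]) if x[i] != 0 else 1.0
    return g

def relaxation_step_method(x0, f_star, lam=1.0, gamma=1.5, max_iter=500, tol=1e-10):
    """Subgradient method with a Polyak-type relaxation step and a deflected
    direction d_k = g_k + beta_k * d_{k-1} (Camerini-Fratta-Maffioli style)."""
    x = np.asarray(x0, dtype=float)
    d = np.zeros_like(x)
    for _ in range(max_iter):
        gap = f(x) - f_star
        if gap <= tol:
            break
        g = subgrad(x)
        # Deflect only when the previous direction forms an obtuse angle
        # with the new subgradient (0 <= gamma <= 2 in the CFM rule).
        dg = float(np.dot(d, g))
        beta = -gamma * dg / float(np.dot(d, d)) if dg < 0 else 0.0
        d = g + beta * d
        # Relaxation step: move by lam * (f(x) - f_star) / ||d||^2 along -d.
        x = x - lam * gap / float(np.dot(d, d)) * d
    return x

if __name__ == "__main__":
    x = relaxation_step_method(np.array([3.0, -2.0, 1.5]), f_star=0.0)
    print("approximate minimizer:", x, " f(x) =", f(x))
```

With lam in (0, 2) and the optimal value f_star known, each iteration is the classical relaxation (Polyak) step applied to the deflected direction rather than to the raw subgradient.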
References
S. Agmon, “The relaxation method for linear inequalities,” Canadian Journal of Mathematics 6 (1954) 282–292.
U. Brännlund, “A convergent subgradient method based on the relaxation step,” in: U. Brännlund, “On relaxation methods for nonsmooth optimization,” Ph.D. Thesis, Department of Mathematics, Kungliga Tekniska Högskolan (Stockholm, 1993).
U. Brännlund, K.C. Kiwiel and P.O. Lindberg, “A descent proximal level bundle method for convex nondifferentiable optimization,” Operations Research Letters 17 (3) (1995) 121–126.
P.M. Camerini, L. Fratta and F. Maffioli, “On improving relaxation methods by modified gradient techniques,” Mathematical Programming Study 3 (1975) 26–34.
J.L. Goffin, “Nondifferentiable optimization and the relaxation method,” in: C. Lemaréchal and R. Mifflin, eds., Nonsmooth Optimization (Pergamon, Oxford, 1977) pp. 31–49.
S. Kim and H. Ahn, “Convergence of a generalized subgradient method for nondifferentiable convex optimization,” Mathematical Programming 50 (1) (1991) 75–80.
K.C. Kiwiel, “An aggregate subgradient method for nonsmooth convex minimization,” Mathematical Programming 27 (3) (1983) 320–341.
K.C. Kiwiel, Methods of Descent for Nondifferentiable Optimization (Springer, Berlin, 1985).
K.C. Kiwiel, “The efficiency of subgradient projection methods for convex nondifferentiable optimization, part II: implementations and extensions,” SIAM Journal on Control and Optimization, to appear.
C. Lemaréchal, “Nondifferentiable optimization,” in: G.L. Nemhauser, A.H.G. Rinnooy Kan and M.J. Todd, eds., Optimization, Handbooks in Operations Research and Management Science, Vol. 1 (North-Holland, Amsterdam, 1989) pp. 529–572.
C. Lemaréchal, A. Nemirovskii and Yu. Nesterov, “New variants of bundle methods,” Mathematical Programming 69 (1) (1995) 111–147.
M. Minoux, Mathematical Programming, Theory and Algorithms (Wiley, New York, 1986).
T. Motzkin and I.J. Schoenberg, “The relaxation method for linear inequalities,” Canadian Journal of Mathematics 6 (1954) 393–404.
B.T. Polyak, “Minimization of unsmooth functionals,” USSR Computational Mathematics and Mathematical Physics 9 (1969) 14–29.
S. Schaible, “Fractional programming. I, duality,” Management Science 22 (1976) 858–867.
N.Z. Shor, Minimization Methods for Non-Differentiable Functions (Springer, Berlin, 1985).
Additional information
Research supported by the Swedish Research Council for Engineering Sciences (TFR).
Cite this article
Brännlund, U. A generalized subgradient method with relaxation step. Mathematical Programming 71, 207–219 (1995). https://doi.org/10.1007/BF01585999