Abstract
Nemirovski’s analysis (SIAM J. Optim. 15:229–251, 2005) shows that the extragradient method has an O(1/t) convergence rate for variational inequalities with Lipschitz continuous monotone operators. For the same class of problems, a family of Fejér monotone projection and contraction methods has been developed over the last decades. Until now, only convergence results have been available for these projection and contraction methods, although numerical experiments indicate that they consistently outperform the extragradient method. The reason is that the former benefit from an ‘optimal’ step size in the contraction sense. In this paper, we prove the O(1/t) convergence rate under a unified conceptual framework that includes the projection and contraction methods as special cases, thereby completing the convergence-rate theory of the existing projection and contraction methods. Preliminary numerical results demonstrate that the projection and contraction methods converge about twice as fast as the extragradient method.
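To make the comparison concrete, here is a minimal NumPy sketch of the two algorithm families the abstract refers to. The extragradient loop follows Korpelevich [12]; the projection and contraction loop shows one common variant in the spirit of He [5], using the direction d(u, ũ) = (u − ũ) − β(F(u) − F(ũ)) and the step ⟨u − ũ, d⟩/‖d‖². The names F, proj, and the parameter values are placeholders, and this is an illustration of the two families rather than the paper’s unified scheme.

```python
import numpy as np

def extragradient(F, proj, u0, beta=0.1, tol=1e-8, max_iter=10_000):
    """Korpelevich's extragradient method: two projections per iteration.
    Convergence requires beta * L < 1, where L is the Lipschitz constant of F."""
    u = u0
    for _ in range(max_iter):
        u_tilde = proj(u - beta * F(u))        # prediction step
        if np.linalg.norm(u - u_tilde) < tol:  # u solves the VI iff u = u_tilde
            return u_tilde
        u = proj(u - beta * F(u_tilde))        # correction step (second projection)
    return u

def projection_contraction(F, proj, u0, beta=0.1, gamma=1.9, tol=1e-8, max_iter=10_000):
    """One common projection-and-contraction variant: a single projection per
    iteration plus an 'optimal' step along the direction d(u, u_tilde)."""
    u = u0
    for _ in range(max_iter):
        u_tilde = proj(u - beta * F(u))        # same prediction as above
        if np.linalg.norm(u - u_tilde) < tol:
            return u_tilde
        d = (u - u_tilde) - beta * (F(u) - F(u_tilde))  # d(u, u_tilde)
        alpha = np.dot(u - u_tilde, d) / np.dot(d, d)   # step size optimal in the contraction sense
        u = u - gamma * alpha * d              # relaxation factor gamma in (0, 2)
    return u
```

Both loops share the same prediction; the projection-and-contraction update replaces the second projection with an adaptively scaled contraction step, which is the source of the speedup reported in the abstract.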





Notes
For convenience, we consider the distance function only in the Euclidean norm. All the results in this paper extend easily to the contraction of the distance function in the G-norm, where G is a positive definite matrix.
Problems of a similar (small) type were tested in [21], where the components of the nonlinear mapping D(u) are D_j(u) = c·arctan(u_j).
In the paper by Harker and Pang [4], the matrix is M = AᵀA + B + D, where A and B are the same matrices as those used here, and D is a diagonal matrix with uniformly distributed random entries d_jj ∈ (0.0, 0.3); a construction sketch follows these notes.
In [4], problems similar to those in the first set are called easy problems, while those in the second set are called hard problems.
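For reproducibility, the sketch below builds a test mapping of the kind described in these notes: F(u) = Mu + c·arctan(u) + q with M = AᵀA + B (plus the diagonal D for the variant of [4]). The entry ranges for A, B, and q are illustrative Harker–Pang-style assumptions, not values quoted from this paper.

```python
import numpy as np

def make_test_problem(n, c=1.0, with_diag=False, seed=0):
    """Build F(u) = M u + c*arctan(u) + q with M = A^T A + B (+ D).
    Entry ranges are illustrative assumptions; [4] and [21] use their own scalings."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(-5.0, 5.0, (n, n))
    B = rng.uniform(-5.0, 5.0, (n, n))
    B = (B - B.T) / 2.0          # skew-symmetric: u^T B u = 0, so the symmetric
    M = A.T @ A + B              # part of M is A^T A >= 0 (monotone linear part)
    if with_diag:                # variant of Harker and Pang [4], cf. note 3
        M += np.diag(rng.uniform(0.0, 0.3, n))
    q = rng.uniform(-500.0, 500.0, n)
    # arctan is componentwise monotone and 1-Lipschitz, so F is monotone
    # and Lipschitz continuous, matching the setting of the paper.
    return lambda u: M @ u + c * np.arctan(u) + q

# Usage: a nonlinear complementarity problem over the nonnegative orthant
F = make_test_problem(100)
proj = lambda u: np.maximum(u, 0.0)   # projection onto Omega = R^n_+
```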
References
Bertsekas, D.P., Tsitsiklis, J.N.: Parallel and Distributed Computation, Numerical Methods. Prentice-Hall, Englewood Cliffs (1989)
Blum, E., Oettli, W.: Mathematische Optimierung: Grundlagen und Verfahren. Ökonometrie und Unternehmensforschung. Springer, Berlin (1975)
Facchinei, F., Pang, J.S.: Finite-Dimensional Variational Inequalities and Complementarity Problems, Vols. I and II. Springer Series in Operations Research. Springer, New York (2003)
Harker, P.T., Pang, J.S.: A damped-Newton method for the linear complementarity problem. Lect. Appl. Math. 26, 265–284 (1990)
He, B.S.: A class of projection and contraction methods for monotone variational inequalities. Appl. Math. Optim. 35, 69–76 (1997)
He, B.S., Liao, L.-Z.: Improvements of some projection methods for monotone nonlinear variational inequalities. J. Optim. Theory Appl. 112, 111–128 (2002)
He, B.S., Xu, M.-H.: A general framework of contraction methods for monotone variational inequalities. Pac. J. Optim. 4, 195–212 (2008)
He, B.S., Yuan, X.M., Zhang, J.J.Z.: Comparison of two kinds of prediction-correction methods for monotone variational inequalities. Comput. Optim. Appl. 27, 247–267 (2004)
He, B.S., Liao, L.-Z., Wang, X.: Proximal-like contraction methods for monotone variational inequalities in a unified framework I: effective quadruplet and primary methods. Comput. Optim. Appl. 51, 649–679 (2012)
He, B.S., Liao, L.-Z., Wang, X.: Proximal-like contraction methods for monotone variational inequalities in a unified framework II: general methods and numerical experiments. Comput. Optim. Appl. 51, 681–708 (2012)
Howard, A.G.: Large margin, transformation learning. PhD Thesis, Graduate School of Arts and Science, Columbia University (2009)
Korpelevich, G.M.: The extragradient method for finding saddle points and other problems. Ekon. Mat. Metod. 12, 747–756 (1976)
Khobotov, E.N.: Modification of the extragradient method for solving variational inequalities and certain optimization problems. USSR Comput. Math. Math. Phys. 27, 120–127 (1987)
Lacoste-Julien, S.: Discriminative machine learning with structure. PhD Thesis, Computer Science, University of California, Berkeley (2009)
Nemirovski, A.: Prox-method with rate of convergence O(1/t) for variational inequality with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J. Optim. 15, 229–251 (2005)
Pan, Y.: A game theoretical approach to constrained OSNR optimization problems in optical network. PhD Thesis, Electrical and Computer Engineering, University of Toronto (2009)
Pan, Y., Pavel, L.: Games with coupled propagated constraints in optical networks with multi-link topologies. Automatica 45, 871–880 (2009)
Sha, F.: Large margin training of acoustic models for speech recognition. PhD Thesis, Computer and Information Science, University of Pennsylvania (2007)
Solodov, M.V., Tseng, P.: Modified projection-type methods for monotone variational inequalities. SIAM J. Control Optim. 34, 1814–1830 (1996)
Sun, D.: A class of iterative methods for solving nonlinear projection equations. J. Optim. Theory Appl. 91, 123–140 (1996)
Taji, K., Fukushima, M., Ibaraki, T.: A globally convergent Newton method for solving strongly monotone variational inequalities. Math. Program. 58, 369–383 (1993)
Taskar, B., Lacoste-Julien, S., Jordan, M.I.: Structured prediction, dual extragradient and Bregman projections. J. Mach. Learn. Res. 7, 1627–1653 (2006)
Taskar, B., Lacoste-Julien, S., Jordan, M.I.: Structured prediction via the extragradient method. In: Weiss, Y., Schölkopf, B., Platt, J. (eds.) Advances in Neural Information Processing Systems (NIPS), vol. 18 (2006)
Tseng, P.: On accelerated proximal gradient methods for convex-concave optimization. Technical report, Department of Mathematics, University of Washington, Seattle (2008)
Xue, G.L., Ye, Y.Y.: An efficient algorithm for minimizing a sum of Euclidean norms with applications. SIAM J. Optim. 7, 1017–1036 (1997)
Acknowledgements
The authors thank X.-L. Fu, M. Li, M. Tao, and X.-M. Yuan for helpful discussions and valuable suggestions.
Additional information
X. Cai was supported by the MOEC fund 20110091110004. G. Gu was supported by the NSFC grants 11001124 and 91130007. B. He was supported by the NSFC grant 91130007 and the MOEC fund 20110091110004.
Cite this article
Cai, X., Gu, G. & He, B. On the O(1/t) convergence rate of the projection and contraction methods for variational inequalities with Lipschitz continuous monotone operators. Comput Optim Appl 57, 339–363 (2014). https://doi.org/10.1007/s10589-013-9599-7