Abstract
We consider the minimization of a differentiable function F defined on \({\mathbb {R}}^N\) that has a Lipschitz-continuous gradient but is not necessarily convex. We propose an accelerated gradient descent approach which combines three strategies, namely (i) a variable metric derived from the majorization-minimization principle; (ii) a subspace strategy incorporating information from the past iterates; (iii) a block alternating update. Under the assumption that F satisfies the Kurdyka–Łojasiewicz property, we give conditions under which the sequence generated by the resulting block majorize-minimize subspace algorithm converges to a critical point of the objective function, and we exhibit convergence rates for its iterates.
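Strategies (i) and (ii) can be illustrated by a minimal single-block sketch: at each iteration, a quadratic majorant of F (with a curvature matrix playing the role of the variable metric) is minimized over the memory-gradient subspace spanned by the current negative gradient and the previous step. This is a hypothetical illustration only, not the paper's full block-alternating algorithm; the function names and the toy quadratic objective are ours.

```python
import numpy as np

def mm_subspace_minimize(grad, majorant_curvature, x0, n_iter=50):
    """Sketch of a majorize-minimize memory-gradient iteration.

    At iterate x_k, the quadratic majorant
        q(u) = F(x_k) + grad(x_k)^T D u + 0.5 * u^T D^T A_k D u
    is minimized over the subspace spanned by the columns of
    D = [-grad(x_k), x_k - x_{k-1}], where A_k = majorant_curvature(x_k)
    is a symmetric matrix making q a majorant of F at x_k.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(n_iter):
        g = grad(x)
        # Memory-gradient subspace: steepest descent + previous step.
        if k == 0:
            D = -g.reshape(-1, 1)
        else:
            D = np.column_stack([-g, x - x_prev])
        A = majorant_curvature(x)
        # Closed-form minimizer of the majorant restricted to span(D);
        # the pseudo-inverse guards against rank-deficient D.
        B = D.T @ A @ D
        u = -np.linalg.pinv(B) @ (D.T @ g)
        x_prev, x = x, x + D @ u
    return x

# Toy example: strongly convex quadratic F(x) = 0.5 x^T Q x - b^T x,
# for which Q itself is a valid majorant curvature matrix.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = mm_subspace_minimize(lambda x: Q @ x - b,
                              lambda x: Q, np.zeros(2), n_iter=20)
```

On a quadratic with the exact curvature, the two-direction subspace makes the scheme behave like a conjugate-gradient method, so `x_star` approaches the solution of Q x = b.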
Acknowledgements
J.-B. Fest and E. Chouzenoux are with the laboratoire CVN, CentraleSupélec, Inria, Université Paris-Saclay, 9 rue Joliot Curie, 91190 Gif-sur-Yvette, France. Email: first.last@centralesupelec.fr. This work is funded by the European Research Council Starting Grant MAJORIS ERC-2019-STG-850925.
Ethics declarations
Conflict of interest
The authors have no relevant financial or non-financial interests to disclose.
About this article
Cite this article
Chouzenoux, E., Fest, JB. Convergence analysis of block majorize-minimize subspace approach. Optim Lett 18, 1111–1130 (2024). https://doi.org/10.1007/s11590-023-02055-z