Abstract
Quasi-Newton methods refer to a class of algorithms at the interface between first- and second-order methods. They aim to progress as substantially as second-order methods per iteration, while maintaining the computational complexity of first-order methods. The approximation of second-order information by first-order derivatives can be expressed as adopting a variable metric, which for (limited-memory) quasi-Newton methods is of type “identity ± low rank”. This paper continues the effort to make these powerful methods available for non-smooth problems occurring, for example, in large-scale machine learning applications, by exploiting this special structure. We develop a line search variant of a recently introduced quasi-Newton primal-dual algorithm, which adds significant flexibility, admits larger steps per iteration, and circumvents the complicated precalculation of a certain operator norm. We prove convergence, including convergence rates, for the proposed method and show that it outperforms related algorithms in a large-scale image deblurring application.
We acknowledge funding by the ANR-DFG joint project TRINOM-DS under the number DFG OC150/5-1.
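To illustrate the computational point behind the “identity ± low rank” structure, here is a minimal sketch in our own notation (it is not the paper's algorithm): assuming a metric of the form \(M = \alpha I + U U^{\top}\) with \(U \in \mathbb{R}^{n \times r}\) and \(r \ll n\), the Sherman–Morrison–Woodbury identity lets one apply \(M^{-1}\) at \(O(n r^{2})\) cost instead of \(O(n^{3})\), which is what makes limited-memory quasi-Newton metrics affordable at scale.

```python
# Hypothetical illustration (not taken from the paper): apply the inverse of a
# metric M = alpha * I + U @ U.T (identity + low rank) via Sherman-Morrison-Woodbury.
import numpy as np

def apply_inv_metric(alpha, U, x):
    """Return M^{-1} x for M = alpha * I_n + U U^T, with U of shape (n, r), r << n."""
    r = U.shape[1]
    # Only a small r x r system is solved: (alpha * I_r + U^T U) z = U^T x
    z = np.linalg.solve(alpha * np.eye(r) + U.T @ U, U.T @ x)
    return (x - U @ z) / alpha

# Consistency check against a dense solve on random data
rng = np.random.default_rng(0)
n, r, alpha = 1000, 5, 2.0
U = rng.standard_normal((n, r))
x = rng.standard_normal(n)
M = alpha * np.eye(n) + U @ U.T
assert np.allclose(apply_inv_metric(alpha, U, x), np.linalg.solve(M, x))
```

The snippet only shows the linear-algebra core of the idea; the paper's primal-dual line search method exploits this kind of structure inside the proximal/resolvent computations themselves.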
Notes
1. For example, the variable \(y^{k}\) defined in (12) is not the dual variable in Algorithm 1.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Wang, S., Fadili, J., Ochs, P. (2023). A Quasi-Newton Primal-Dual Algorithm with Line Search. In: Calatroni, L., Donatelli, M., Morigi, S., Prato, M., Santacesaria, M. (eds) Scale Space and Variational Methods in Computer Vision. SSVM 2023. Lecture Notes in Computer Science, vol 14009. Springer, Cham. https://doi.org/10.1007/978-3-031-31975-4_34
DOI: https://doi.org/10.1007/978-3-031-31975-4_34
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-31974-7
Online ISBN: 978-3-031-31975-4