
An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization

Published: 20 February 2024

Abstract

This paper focuses on the minimization of the sum of a twice continuously differentiable function f and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed, in which the Hessian of f is approximated by adding a regularization term involving the ϱth power of the KKT residual. For ϱ=0, we justify the global convergence of the iterate sequence for the KL objective function and its R-linear convergence rate for the KL objective function of exponent 1/2. For ϱ∈(0,1), by assuming that cluster points satisfy a locally Hölderian error bound of order q on a second-order stationary point set and a local error bound of order q>1+ϱ on the common stationary point set, respectively, we establish the global convergence of the iterate sequence and its superlinear convergence rate with order depending on q and ϱ. A dual semismooth Newton augmented Lagrangian method is also developed for seeking an inexact minimizer of subproblems. Numerical comparisons with two state-of-the-art methods on ℓ1-regularized Student's t-regressions, group penalized Student's t-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.
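The scheme described in the abstract can be sketched as follows for the ℓ1-regularized case. This is a minimal illustrative sketch, not the authors' algorithm: it regularizes the Hessian by the ϱth power of the KKT residual as described, but solves each subproblem with a plain proximal-gradient loop, takes full steps without the paper's line search or acceptance tests, and all function names and tolerances are assumptions.

```python
import numpy as np

def prox_l1(z, t):
    # Soft-thresholding: proximal mapping of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def kkt_residual(x, grad_f, lam):
    # KKT residual r(x) = ||x - prox_{lam||.||_1}(x - grad f(x))||;
    # r(x) = 0 iff x is a stationary point of f + lam||.||_1.
    return np.linalg.norm(x - prox_l1(x - grad_f(x), lam))

def reg_prox_newton(grad_f, hess_f, lam, x0, rho=0.5,
                    max_iter=50, inner_iter=500, tol=1e-7):
    """Sketch of a regularized proximal Newton iteration (hypothetical names)."""
    x = x0.copy()
    for _ in range(max_iter):
        r = kkt_residual(x, grad_f, lam)
        if r <= tol:
            break
        g = grad_f(x)
        # Regularize the Hessian by the rho-th power of the KKT residual.
        H = hess_f(x) + (r ** rho) * np.eye(len(x))
        # Inexact subproblem solve: minimize over d
        #   g.d + 0.5 d'Hd + lam*||x + d||_1
        # by proximal gradient with step 1/L, L = ||H||_2.
        L = np.linalg.norm(H, 2)
        d = np.zeros_like(x)
        for _ in range(inner_iter):
            q_grad = g + H @ d                       # gradient of the quadratic part
            d_new = prox_l1(x + d - q_grad / L, lam / L) - x
            if np.linalg.norm(d_new - d) <= 1e-10:
                d = d_new
                break
            d = d_new
        x = x + d                                    # full step (no line search here)
    return x
```

On a small ℓ1-regularized least-squares instance (f(x) = ½||Ax − b||²), the iteration drives the KKT residual toward zero; as it does, the regularization r^ϱ vanishes and the subproblems approach pure proximal Newton steps, which is the mechanism behind the superlinear rates discussed in the abstract.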


Cited By

  • An Inexact Proximal Newton Method for Nonconvex Composite Minimization. Journal of Scientific Computing 102(3). doi:10.1007/s10915-025-02805-4 (1 March 2025)
  • A VMiPG Method for Composite Optimization with Nonsmooth Term Having No Closed-form Proximal Mapping. Journal of Scientific Computing 101(3). doi:10.1007/s10915-024-02712-0 (4 November 2024)
  • An inexact regularized proximal Newton method without line search. Computational Optimization and Applications 89(3), 585–624. doi:10.1007/s10589-024-00600-9 (1 December 2024)


Published In

Computational Optimization and Applications  Volume 88, Issue 2
Jun 2024
311 pages

Publisher

Kluwer Academic Publishers

United States

Publication History

Published: 20 February 2024
Accepted: 18 January 2024
Received: 26 June 2023

Author Tags

  1. Nonconvex and nonsmooth optimization
  2. Regularized proximal Newton method
  3. Global convergence
  4. Convergence rate
  5. KL function
  6. Metric q-subregularity

Author Tags (MSC codes)

  1. 90C26
  2. 49M15
  3. 90C55

Qualifiers

  • Research-article


