
Regularization Parameter Selection for the Low Rank Matrix Recovery

Journal of Optimization Theory and Applications

Abstract

A popular approach to recovering low rank matrices is nuclear norm regularized minimization (NRM), for which selecting the regularization parameter is unavoidable. In this paper, we develop a novel rule for choosing the regularization parameter of NRM with the help of duality theory. Our result provides a safe set for the regularization parameter when the rank of the solution has a known upper bound. Furthermore, we apply this idea to NRM with quadratic and Huber loss functions, and establish simple formulae for the regularization parameters. Finally, we report numerical results on several signal shapes, obtained by embedding our rule into cross validation, which show that our rule reduces the computational time needed to select the regularization parameter. To the best of our knowledge, this is the first attempt to select the regularization parameter for low rank matrix recovery.
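The paper's exact selection rule appears in the full text; as a rough illustration of the setting only, the sketch below solves NRM with a quadratic loss by proximal gradient descent with singular value thresholding, under a masked-observation model. The function names (`svt`, `nrm_quadratic`), the toy data, and the step size are our own illustrative choices, not taken from the paper. The comment on `lam_max` reflects a standard duality fact for nuclear norm regularization: above the spectral norm of the gradient at zero, the solution collapses to the zero matrix, which is the kind of bound that lets cross validation restrict its search grid.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def nrm_quadratic(Y, mask, lam, step=1.0, iters=300):
    """Proximal gradient for min_X 0.5 * ||mask * (X - Y)||_F^2 + lam * ||X||_*."""
    X = np.zeros_like(Y)
    for _ in range(iters):
        # Gradient step on the quadratic loss, then the nuclear norm prox.
        X = svt(X - step * mask * (X - Y), step * lam)
    return X

# Toy rank-1 matrix observed on roughly 70% of its entries (illustrative data).
rng = np.random.default_rng(0)
Y = np.outer(rng.standard_normal(20), rng.standard_normal(15))
mask = (rng.random(Y.shape) < 0.7).astype(float)

# Standard duality fact: for lam >= ||mask * Y||_2 (spectral norm of the
# gradient at X = 0), the minimizer is exactly X = 0, so cross validation
# only needs to search regularization parameters below this value.
lam_max = np.linalg.norm(mask * Y, 2)
X_hat = nrm_quadratic(mask * Y, mask, lam=0.05 * lam_max)
```

This is only a sketch of the optimization problem the paper studies; the paper's contribution is a safe interval for `lam` derived from a rank upper bound, which shrinks the grid such a cross validation loop must scan.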


Notes

  1. http://www.dabi.temple.edu/shape/MPEG7/index.html.



Acknowledgements

We sincerely thank the referees as well as the associate editor for their constructive comments which have significantly improved the quality of the paper. This work was supported by the National Natural Science Foundation of China (12071022).

Author information

Corresponding author

Correspondence to Pan Shang.

Additional information

Communicated by Levent Tunçel.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Shang, P., Kong, L. Regularization Parameter Selection for the Low Rank Matrix Recovery. J Optim Theory Appl 189, 772–792 (2021). https://doi.org/10.1007/s10957-021-01852-9

