
Optimality conditions for Tucker low-rank tensor optimization

Published: 13 March 2023

Abstract

Optimization problems with tensor variables are widely used in statistics, machine learning, pattern recognition, signal processing, computer vision, etc. In these applications, the low-rankness of tensors is an intrinsic property that helps unearth latent but important structures or features in the corresponding high-dimensional multi-way datasets, leading to the study of low-rank tensor optimization (LRTO for short). For the general framework of LRTO, little has been addressed in optimization theory. This motivates us to study optimality conditions, with special emphasis on Tucker low-rank constrained problems and Tucker low-rank decomposition-based reformulations. It is noteworthy that all the involved optimization problems are nonconvex, and even discontinuous, owing to the complexity of the tensor Tucker rank function or of the multi-linear decomposition with orthogonality or even group sparsity constraints imposed on the factor matrices. By employing tools from variational analysis, especially normal cones to sets of low-rank matrices and properties of matrix manifolds, we propose necessary and/or sufficient optimality conditions for Tucker low-rank tensor optimization problems, which enrich the theory of nonconvex and nonsmooth optimization.
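The Tucker (multilinear) rank that the abstract constrains is the tuple of matrix ranks of a tensor's mode-n unfoldings. As a minimal illustration of that definition (a sketch for orientation, not code from the paper), the following NumPy snippet computes it; the function name `tucker_rank` is our own:

```python
import numpy as np

def tucker_rank(T, tol=1e-10):
    # Tucker (multilinear) rank: the tuple of matrix ranks of the
    # mode-n unfoldings T_(n), for n = 0, ..., T.ndim - 1.
    return tuple(
        np.linalg.matrix_rank(
            # Mode-n unfolding: bring mode n to the front, flatten the rest.
            np.moveaxis(T, n, 0).reshape(T.shape[n], -1),
            tol=tol,
        )
        for n in range(T.ndim)
    )

# An outer product of three vectors has Tucker rank (1, 1, 1).
a, b, c = np.random.rand(3), np.random.rand(4), np.random.rand(5)
T = np.einsum('i,j,k->ijk', a, b, c)
print(tucker_rank(T))  # (1, 1, 1)
```

A Tucker low-rank constraint bounds each entry of this tuple, which is why the feasible set is nonconvex and the rank function discontinuous.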


Cited By

  • (2023) Preface to Asen L. Dontchev Memorial Special Issue. Computational Optimization and Applications 86(3), 795-800. https://doi.org/10.1007/s10589-023-00537-5. Online publication date: 1 Dec 2023.


Published In

Computational Optimization and Applications  Volume 86, Issue 3
Dec 2023
575 pages

Publisher

Kluwer Academic Publishers

United States

Publication History

Published: 13 March 2023
Accepted: 16 February 2023
Received: 25 May 2022

Author Tags

  1. Tensor optimization
  2. Optimality conditions
  3. Tucker decomposition
  4. Low-rankness

Qualifiers

  • Research-article

Funding Sources

  • Beijing Natural Science Foundation
