DOI: 10.1145/3132847.3132945

Tensor Rank Estimation and Completion via CP-based Nuclear Norm

Published: 06 November 2017
    Abstract

    Tensor completion (TC) is the challenging problem of recovering the missing entries of a tensor from a partial observation. One main TC approach is based on CP/Tucker decomposition. However, this approach typically requires the tensor rank to be determined a priori, and rank estimation is difficult in practice. Several Bayesian solutions have been proposed, but they often under- or over-estimate the tensor rank and are quite slow. To address this problem of rank estimation with missing entries, we view the weight vector of the orthogonal CP decomposition of a tensor as analogous to the vector of singular values of a matrix. We then define a new CP-based tensor nuclear norm as the $L_1$-norm of this weight vector, and propose Tensor Rank Estimation based on $L_1$-regularized orthogonal CP decomposition (TREL1) for both CP-rank and Tucker-rank. Specifically, we add a regularization term based on the CP-based tensor nuclear norm when minimizing the reconstruction error in TC, which automatically determines the rank of an incomplete tensor. Experimental results on both synthetic and real data show that: 1) given sufficient observed entries, TREL1 estimates the true rank (both CP-rank and Tucker-rank) of incomplete tensors well; 2) the rank estimated by TREL1 consistently improves the recovery accuracy of decomposition-based TC methods; and 3) TREL1 is generally insensitive to its parameters and is more efficient than existing rank estimation methods.
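    In the notation suggested by the abstract, write the orthogonal CP decomposition of a third-order tensor as $\mathcal{X} \approx \sum_{r=1}^{R} \lambda_r\, \mathbf{a}_r \circ \mathbf{b}_r \circ \mathbf{c}_r$ with unit-norm factor vectors; the CP-based tensor nuclear norm is then $\|\boldsymbol{\lambda}\|_1 = \sum_{r=1}^{R} |\lambda_r|$, and TREL1 adds this term to the reconstruction error over the observed entries so that redundant weights are driven to zero. The sketch below illustrates that idea; it is not the authors' implementation. It fits a deliberately over-sized CP model to the observed entries using a simple imputation-plus-ALS strategy for missing data, soft-thresholds the weight vector after every sweep (the proximal operator of the $L_1$ penalty), and reports the number of surviving weights as the estimated CP-rank. The orthogonality constraints of the paper's orthogonal CP decomposition are omitted, and the function name `trel1_rank_estimate`, the parameters `alpha` and `n_iter`, and the synthetic demo are illustrative assumptions, not interfaces or settings from the paper.

```python
import numpy as np

def cp_reconstruct(lam, A, B, C):
    """Rank-R CP reconstruction: sum_r lam[r] * a_r (outer) b_r (outer) c_r."""
    return np.einsum('r,ir,jr,kr->ijk', lam, A, B, C)

def trel1_rank_estimate(X, mask, R=8, alpha=0.5, n_iter=100, seed=0):
    """Estimate the CP-rank of a partially observed 3-way tensor X.

    X     : (I, J, K) array; unobserved entries may hold arbitrary values.
    mask  : boolean (I, J, K) array, True where X is observed.
    R     : deliberately over-estimated initial rank.
    alpha : strength of the L1 penalty on the CP weight vector (illustrative).
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    # Unit-norm random factor columns plus a separate weight vector `lam`.
    A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
    for F in (A, B, C):
        F /= np.linalg.norm(F, axis=0)
    lam = np.ones(R)

    for _ in range(n_iter):
        # Impute missing entries with the current low-rank model.
        Xf = np.where(mask, X, cp_reconstruct(lam, A, B, C))

        # One ALS sweep; each mode absorbs the weights, which are then
        # split off again so the factor columns stay unit-norm.
        for mode in range(3):
            if mode == 0:
                M = np.einsum('ijk,jr,kr->ir', Xf, B, C)
                G = (B.T @ B) * (C.T @ C)
            elif mode == 1:
                M = np.einsum('ijk,ir,kr->jr', Xf, A, C)
                G = (A.T @ A) * (C.T @ C)
            else:
                M = np.einsum('ijk,ir,jr->kr', Xf, A, B)
                G = (A.T @ A) * (B.T @ B)
            F = M @ np.linalg.pinv(G)          # weighted factor, e.g. A @ diag(lam)
            lam = np.linalg.norm(F, axis=0)    # pull the weights out ...
            F = F / np.maximum(lam, 1e-12)     # ... and renormalize the columns
            if mode == 0:
                A = F
            elif mode == 1:
                B = F
            else:
                C = F

        # Soft-thresholding: the proximal operator of the L1 penalty on the
        # CP weight vector. Redundant components are set to exactly zero.
        lam = np.maximum(lam - alpha, 0.0)

    est_rank = int(np.count_nonzero(lam))
    return est_rank, lam, (A, B, C)

# Usage: a synthetic rank-3 tensor with roughly 50% of its entries observed.
if __name__ == '__main__':
    rng = np.random.default_rng(1)
    lam0 = np.array([10.0, 5.0, 2.0])
    A0, B0, C0 = (rng.standard_normal((n, 3)) for n in (20, 25, 30))
    for F in (A0, B0, C0):
        F /= np.linalg.norm(F, axis=0)
    X = cp_reconstruct(lam0, A0, B0, C0)
    mask = rng.random(X.shape) < 0.5
    rank, weights, _ = trel1_rank_estimate(X, mask, R=8, alpha=0.5)
    print('estimated CP-rank:', rank)          # typically 3 for this easy example
    print('surviving weights:', np.round(weights, 2))
```

    Because soft-thresholding sets redundant weights to exactly zero, over-estimating the initial rank `R` is harmless in this sketch, which mirrors the automatic rank determination described in the abstract.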




    Published In

    CIKM '17: Proceedings of the 2017 ACM on Conference on Information and Knowledge Management
    November 2017
    2604 pages
    ISBN:9781450349185
    DOI:10.1145/3132847


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. CP decomposition
    2. CP-based tensor nuclear norm
    3. tensor completion
    4. tensor rank estimation

    Qualifiers

    • Research-article

    Funding Sources

    • HKBU Faculty Research Grant
    • SZSTI Grant
    • NSFC
    • HKBU KTO grant

    Conference

    CIKM '17

    Acceptance Rates

    CIKM '17 paper acceptance rate: 171 of 855 submissions (20%)
    Overall acceptance rate: 1,861 of 8,427 submissions (22%)


    Article Metrics

    • Downloads (last 12 months): 35
    • Downloads (last 6 weeks): 2


    Cited By

    • (2024) Efficient enhancement of low-rank tensor completion via thin QR decomposition. Frontiers in Big Data, Vol. 7. DOI: 10.3389/fdata.2024.1382144. Online publication date: 2-Jul-2024.
    • (2024) Single Finger Trajectory Prediction From Intracranial Brain Activity Using Block-Term Tensor Regression With Fast and Automatic Component Extraction. IEEE Transactions on Neural Networks and Learning Systems, Vol. 35, No. 7, 8897-8908. DOI: 10.1109/TNNLS.2022.3216589. Online publication date: Jul-2024.
    • (2023) Unmixing aware compression of hyperspectral image by rank aware orthogonal parallel factorization decomposition. Journal of Applied Remote Sensing, Vol. 17, No. 4. DOI: 10.1117/1.JRS.17.046509. Online publication date: 1-Oct-2023.
    • (2023) Tensor Decompositions for Hyperspectral Data Processing in Remote Sensing: A comprehensive review. IEEE Geoscience and Remote Sensing Magazine, Vol. 11, No. 1, 26-72. DOI: 10.1109/MGRS.2022.3227063. Online publication date: Mar-2023.
    • (2022) Tensor Completion via Complementary Global, Local, and Nonlocal Priors. IEEE Transactions on Image Processing, Vol. 31, 984-999. DOI: 10.1109/TIP.2021.3138325. Online publication date: 2022.
    • (2022) On The Relaxation of Orthogonal Tensor Rank and Its Nonconvex Riemannian Optimization for Tensor Completion. ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 3628-3632. DOI: 10.1109/ICASSP43922.2022.9746711. Online publication date: 23-May-2022.
    • (2022) Joint Factors and Rank Estimation for the Canonical Polyadic Decomposition Based on Convex Optimization. IEEE Access, Vol. 10, 82295-82304. DOI: 10.1109/ACCESS.2022.3189793. Online publication date: 2022.
    • (2022) Multi‐view side information‐incorporated tensor completion. Numerical Linear Algebra with Applications, Vol. 30, No. 5. DOI: 10.1002/nla.2485. Online publication date: 19-Dec-2022.
    • (2021) Tensor Topic Models with Graphs and Applications on Individualized Travel Patterns. 2021 IEEE 37th International Conference on Data Engineering (ICDE), 2756-2761. DOI: 10.1109/ICDE51399.2021.00320. Online publication date: Apr-2021.
    • (2020) Tensor completion-based 5G positioning with partial channel measurements. Proceedings of the Twenty-First International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing, 339-344. DOI: 10.1145/3397166.3413464. Online publication date: 11-Oct-2020.
