Abstract
This note presents a unified analysis of the recovery of simple objects from random linear measurements. When the linear functionals are Gaussian, we show that an s-sparse vector in \(\mathbb{R}^n\) can be efficiently recovered from 2s log n measurements with high probability and a rank r, n × n matrix can be efficiently recovered from r(6n − 5r) measurements with high probability. For sparse vectors, this is within an additive factor of the best known nonasymptotic bounds. For low-rank matrices, this matches the best known bounds. We present a parallel analysis for block-sparse vectors obtaining similarly tight bounds. In the case of sparse and block-sparse signals, we additionally demonstrate that our bounds are only slightly weakened when the measurement map is a random sign matrix. Our results are based on analyzing a particular dual point which certifies optimality conditions of the respective convex programming problem. Our calculations rely only on standard large deviation inequalities and our analysis is self-contained.
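The sparse-recovery setting described above can be illustrated numerically. The following is a minimal sketch (not code from the paper; it assumes numpy and cvxpy with a default solver are installed): draw m = ⌈2s log n⌉ Gaussian measurements of an s-sparse vector and solve the ℓ1-minimization (basis pursuit) problem, which succeeds with high probability under the stated bound.

```python
# Minimal sketch of sparse recovery from Gaussian measurements via l1 minimization.
# Assumes numpy and cvxpy are available; parameter values are illustrative only.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, s = 200, 5                          # ambient dimension and sparsity
m = int(np.ceil(2 * s * np.log(n)))    # the 2 s log n measurement count from the abstract

# s-sparse ground truth with random support and Gaussian nonzero entries
x0 = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x0[support] = rng.standard_normal(s)

A = rng.standard_normal((m, n))        # Gaussian measurement map
b = A @ x0                             # noiseless measurements

# Basis pursuit: minimize ||x||_1 subject to A x = b
x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == b])
prob.solve()

print("recovery error:", np.linalg.norm(x.value - x0))
```

An analogous sketch for the low-rank case would replace the ℓ1 norm with the nuclear norm and use r(6n − 5r) Gaussian measurements of an n × n rank-r matrix.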
Cite this article
Candès, E., Recht, B. Simple bounds for recovering low-complexity models. Math. Program. 141, 577–589 (2013). https://doi.org/10.1007/s10107-012-0540-0