Abstract
Principal Component Analysis (PCA) is one of the most widely used algorithmic techniques. When is PCA provably effective? What are its main limitations and how can we get around them? In this note, we discuss three specific challenges.
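As a concrete illustration (not part of the paper itself), PCA amounts to centering the data and projecting onto the top singular vectors of the sample matrix. The sketch below assumes synthetic data lying near a 2-dimensional subspace of R^5; all names are illustrative.

```python
import numpy as np

# Hypothetical data: 200 points in R^5 lying near a 2-dimensional subspace.
rng = np.random.default_rng(0)
basis = rng.standard_normal((2, 5))
X = rng.standard_normal((200, 2)) @ basis + 0.01 * rng.standard_normal((200, 5))

# PCA: center the data, then take the top singular vectors of the sample matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
projected = Xc @ Vt[:k].T  # coordinates in the top-k principal subspace

# Fraction of total variance captured by the first k components.
explained = (s ** 2) / (s ** 2).sum()
print(explained[:k].sum())
```

On near-low-dimensional data like this, the top two components capture almost all of the variance; the note's three challenges concern settings where such clean spectral structure is absent or hidden.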
© 2012 Springer-Verlag Berlin Heidelberg
Vempala, S.S. (2012). Effective Principal Component Analysis. In: Navarro, G., Pestov, V. (eds) Similarity Search and Applications. SISAP 2012. Lecture Notes in Computer Science, vol 7404. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32153-5_1
Print ISBN: 978-3-642-32152-8
Online ISBN: 978-3-642-32153-5