
An Almost Optimal Unrestricted Fast Johnson-Lindenstrauss Transform

Published: 01 June 2013

Abstract

The problems of random projection and sparse reconstruction have much in common and have individually received much attention. Surprisingly, until now they progressed in parallel and remained mostly separate. Here, we employ new tools from probability in Banach spaces that were successfully used in the context of sparse reconstruction to advance on an open problem in random projection. In particular, we generalize and use an intricate result by Rudelson and Vershynin [2008] for sparse reconstruction, which uses Dudley's theorem for bounding Gaussian processes. Our main result states that any set of N = exp(Õ(n)) real vectors in n-dimensional space can be linearly mapped to a space of dimension k = O(log N polylog(n)), while (1) preserving the pairwise distances among the vectors to within any constant distortion and (2) being able to apply the transformation in time O(n log n) on each vector. This improves on the best known bounds of N = exp(Õ(n^{1/2})) achieved by Ailon and Liberty [2009] and N = exp(Õ(n^{1/3})) by Ailon and Chazelle [2010]. The dependence on the distortion constant, however, is suboptimal, and since the publication of an early version of this work, the gap between the upper and lower bounds has been considerably tightened by Krahmer and Ward [2011]. For constant distortion, this settles the open question posed by these authors up to a polylog(n) factor while considerably simplifying their constructions.
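To make guarantee (1) concrete, the sketch below empirically checks the pairwise-distance distortion of a classical dense Gaussian JL map on a random point set. This is not the paper's fast construction, only the baseline guarantee that a JL transform must satisfy; the dimensions, point count, and seed are arbitrary choices for the demo.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, k, N = 1024, 256, 50           # ambient dim, target dim, number of points
X = rng.standard_normal((N, n))   # arbitrary point set of size N in R^n

# Classical dense JL map: i.i.d. Gaussian entries, scaled so that
# E||Ax||^2 = ||x||^2.  Applying it costs O(nk) per vector; the point of a
# *fast* JL transform is to bring this down to O(n log n).
A = rng.standard_normal((k, n)) / np.sqrt(k)
Xk = X @ A.T                      # projected points in R^k

# Empirical distortion over all pairwise squared distances.
ratios = [np.sum((Xk[i] - Xk[j]) ** 2) / np.sum((X[i] - X[j]) ** 2)
          for i, j in combinations(range(N), 2)]
eps = max(abs(r - 1.0) for r in ratios)  # worst observed relative distortion
```

With k on the order of log N (up to the distortion dependence), concentration of measure keeps `eps` small uniformly over all pairs.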

References

[1]
Dimitris Achlioptas. 2003. Database-friendly random projections: Johnson-Lindenstrauss with binary coins. J. Comput. Syst. Sci. 66, 4, 671--687.
[2]
Nir Ailon and Bernard Chazelle. 2006. Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform. In Proceedings of the 38th Annual Symposium on Theory of Computing (STOC). 557--563.
[3]
Nir Ailon and Bernard Chazelle. 2010. Faster dimension reduction. Comm. ACM 53, 2, 97--104.
[4]
Nir Ailon and Edo Liberty. 2009. Fast dimension reduction using Rademacher series on dual BCH codes. Discr. Comput. Geom. 42, 4, 615--630.
[5]
Noga Alon. 2003. Problems and results in extremal combinatorics--I. Disc. Math. 273, 1--3, 31--53.
[6]
R. G. Baraniuk, M. A. Davenport, R. A. DeVore, and M. B. Wakin. 2008. A simple proof of the restricted isometry property for random matrices. Construct. Approx. 28, 3, 253--263.
[7]
Alfred M. Bruckstein, David L. Donoho, and Michael Elad. 2009. From sparse solutions of systems of equations to sparse modeling of signals and images. SIAM Rev. 51, 1, 34--81.
[8]
Emmanuel J. Candès, Justin K. Romberg, and Terence Tao. 2006. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inf. Theory 52, 2, 489--509.
[9]
Kenneth L. Clarkson and David P. Woodruff. 2009. Numerical linear algebra in the streaming model. In Proceedings of STOC. 205--214.
[10]
Anirban Dasgupta, Ravi Kumar, and Tamás Sarlós. 2010. A sparse Johnson-Lindenstrauss transform. In Proceedings of the 42nd ACM Symposium on Theory of Computing (STOC).
[11]
S. Dasgupta and A. Gupta. 1999. An elementary proof of the Johnson-Lindenstrauss lemma. Tech. rep. 99-006, UC Berkeley.
[12]
David L. Donoho. 2006. Compressed sensing. IEEE Trans. Inf. Theory 52, 4, 1289--1306.
[13]
P. Frankl and H. Maehara. 1987. The Johnson-Lindenstrauss lemma and the sphericity of some graphs. J. Combin. Theory Ser. A 44, 355--362.
[14]
Aicke Hinrichs and Jan Vybíral. 2011. Johnson-Lindenstrauss lemma for circulant matrices. Rand. Struct. Algor. 39, 3, 391--398.
[15]
T. S. Jayram and David Woodruff. 2011. Optimal bounds for Johnson-Lindenstrauss transforms and streaming problems with sub-constant error. In Proceedings of the 22nd Annual ACM-SIAM Symposium on Discrete Algorithms (SODA'11). 1--10.
[16]
William B. Johnson and Joram Lindenstrauss. 1984. Extensions of Lipschitz mappings into a Hilbert space. Contemp. Math. 26, 189--206.
[17]
Felix Krahmer and Rachel Ward. 2011. New and improved Johnson-Lindenstrauss embeddings via the restricted isometry property. SIAM J. Math. Anal. 43, 1269--1281.
[18]
Michel Ledoux and Michel Talagrand. 1991. Probability in Banach Spaces: Isoperimetry and Processes. Springer-Verlag.
[19]
Edo Liberty, Nir Ailon, and Amit Singer. 2008. Dense fast random projections and Lean Walsh transforms. In Proceedings of APPROX-RANDOM. 512--522.
[20]
Jiří Matoušek. 2008. On variants of the Johnson-Lindenstrauss lemma. Rand. Struct. Algor. 33, 2, 142--156.
[21]
Gilles Pisier. 1989. The Volume of Convex Bodies and Banach Space Geometry. Number 94 in Cambridge Tracts in Mathematics. Cambridge University Press.
[22]
Mark Rudelson and Roman Vershynin. 2008. On sparse reconstruction from Fourier and Gaussian measurements. Commun. Pure Appl. Math. 61, 1025--1045.
[23]
Tamás Sarlós. 2006. Improved approximation algorithms for large matrices via random projections. In Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS’06). 143--152.
[24]
Jan Vybíral. 2011. A variant of the Johnson-Lindenstrauss lemma for circulant matrices. J. Funct. Anal. 260, 4, 1096--1105.
[25]
Franco Woolfe, Edo Liberty, Vladimir Rokhlin, and Mark Tygert. 2008. A fast randomized algorithm for the approximation of matrices. Appl. Computat. Harm. Anal. 25, 3, 335--366.

Cited By

  • (2024) Bridging Dense and Sparse Maximum Inner Product Search. ACM Transactions on Information Systems 42, 6, 1--38. DOI: 10.1145/3665324.
  • (2023) Dense-exponential random features. In Proceedings of the 37th International Conference on Neural Information Processing Systems. 957--985.
  • (2023) Simplex random features. In Proceedings of the 40th International Conference on Machine Learning. 28864--28888.

    Reviews

    Guangwu Xu

    This paper deals with the randomized construction of efficiently computable Johnson-Lindenstrauss transforms. The paper greatly improves previous results in the literature by establishing an almost optimal result in terms of both the reduced dimension and the complexity of applying such transforms. Given a finite subset Y ⊆ R^n of size N, a Johnson-Lindenstrauss transform is a map Φ: R^n → R^k, with the dimension k much smaller than n, that approximately preserves the relative distances between any two elements of Y, in the sense that (1-ε)||y_1 - y_2||_2^2 ≤ ||Φ(y_1) - Φ(y_2)||_2^2 ≤ (1+ε)||y_1 - y_2||_2^2, for a fixed distortion parameter ε. Several random constructions of linear Johnson-Lindenstrauss transforms are known that are optimal in the parameter k, that is, k = O(ε^{-2} log N). The authors of this paper, as well as Bernard Chazelle, study the fast Johnson-Lindenstrauss transform: they are interested in constructing a distribution of k × n matrices Φ such that, in addition to approximately preserving distances for any pair of vectors in Y, the vector Φx can be computed in O(n log n) steps for each vector x. The main result of this paper is to prove that such a class of matrices can be obtained when the cardinality of Y is N = exp(Õ(n)). This greatly improves previous results, where the cardinality of Y was exp(Õ(n^{1/3})) (due to Ailon and Chazelle [1]) and exp(Õ(n^{1/2})) (due to Ailon and Liberty [2]). The reduced dimension k is bounded by O(log N polylog(n)). The key ingredient in proving this result is to refine Rudelson and Vershynin's powerful technique for constructing a compressed sensing matrix from an n × n orthogonal matrix (with entries of magnitude O(1/√n)) by randomly selecting k rows. The refinement enables the authors to treat both sparse vectors and vectors with small norms.
    Each of the Johnson-Lindenstrauss transforms constructed in this paper has the following form: it is the product of a matrix whose k rows are drawn uniformly at random from the n × n Hadamard matrix and an n × n diagonal matrix whose diagonal entries are drawn uniformly from {-1, 1}. Online Computing Reviews Service
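The transform shape described above (a random ±1 diagonal followed by k uniformly sampled rows of the Hadamard matrix) can be sketched as follows, with the fast Walsh-Hadamard transform supplying the O(n log n) application time. The scaling factor and the parameter choices below are illustrative assumptions for the demo, not values taken from the paper.

```python
import numpy as np

def fwht(x):
    """Unnormalized fast Walsh-Hadamard transform in O(n log n) time.
    len(x) must be a power of 2; H has entries +-1, so H @ H == n * I."""
    x = np.asarray(x, dtype=float).copy()
    n, h = len(x), 1
    while h < n:
        y = x.reshape(-1, 2, h)          # butterfly pairs at stride h
        a, b = y[:, 0, :].copy(), y[:, 1, :].copy()
        y[:, 0, :], y[:, 1, :] = a + b, a - b
        x = y.reshape(n)
        h *= 2
    return x

rng = np.random.default_rng(1)
n, k = 1024, 128                              # illustrative sizes (n a power of 2)
d = rng.choice([-1.0, 1.0], size=n)           # diagonal of D, uniform random signs
rows = rng.choice(n, size=k, replace=False)   # k rows of the n x n Hadamard matrix

def fast_jl(x):
    # x -> (1/sqrt(k)) * (H D x) restricted to the sampled rows; this
    # scaling preserves squared Euclidean norms in expectation, since
    # H/sqrt(n) is orthogonal and each row is sampled with probability k/n.
    return fwht(d * x)[rows] / np.sqrt(k)
```

Multiplying by D first spreads out any sparse input, which is what makes the subsequent uniform row sampling of the Hadamard matrix work.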


    Published In

    ACM Transactions on Algorithms, Volume 9, Issue 3
    Special Issue on SODA'11
    June 2013, 167 pages
    ISSN: 1549-6325
    EISSN: 1549-6333
    DOI: 10.1145/2483699

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 01 June 2013
    Accepted: 01 December 2012
    Revised: 01 December 2012
    Received: 01 June 2011
    Published in TALG Volume 9, Issue 3

    Author Tags

    1. Johnson-Lindenstrauss
    2. compressed sensing
    3. restricted isometry

    Qualifiers

    • Research-article
    • Research
    • Refereed

