I wish I had known about the Johnson-Lindenstrauss lemma, Takens' embedding theorem, or even surfing on manifolds to do things faster much earlier than I did. For the sake of putting things in context, here is the connection between compressive sensing and the Johnson-Lindenstrauss lemma (from The Johnson-Lindenstrauss Lemma Meets Compressed Sensing):
...In Compressed Sensing (CS) [2, 7], for example, a random projection of a high-dimensional but sparse signal vector onto a lower-dimensional space has been shown, with high probability, to contain enough information to enable signal reconstruction with small or zero error. Random projections also play a fundamental role in the study of point clouds in high-dimensional spaces. The Johnson-Lindenstrauss (JL) lemma [12], for example, shows that with high probability the geometry of a point cloud is not disturbed by certain Lipschitz mappings onto a space of dimension logarithmic in the number of points. The statement and proofs of the JL-lemma have been simplified considerably by using random linear projections and concentration inequalities [1]....
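To make the lemma concrete, here is a minimal sketch (assuming numpy and scipy; the point count, dimensions, and seed are illustrative choices of mine, not from the paper): project a random point cloud with a scaled Gaussian matrix and check that all pairwise distances survive within a narrow band.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n_points, d, k = 100, 5000, 300   # JL needs k ~ O(log(n) / eps^2), not O(d)

X = rng.standard_normal((n_points, d))          # point cloud in R^d
Phi = rng.standard_normal((k, d)) / np.sqrt(k)  # random linear projection
Y = X @ Phi.T                                   # projected cloud in R^k

# Ratio of each pairwise distance after projection to the one before.
ratios = pdist(Y) / pdist(X)
print(f"distance ratios lie in [{ratios.min():.3f}, {ratios.max():.3f}]")
# With high probability every ratio lies in [1 - eps, 1 + eps].
```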
Equally, why should we care about Takens' embedding theorem? Here is some insight (from the first paper below) as to why:
"...The underlying problem is that while Takens’ theorem guarantees the preservation of the attractor’s topology, it does not guarantee that the geometry of the attractor is also preserved. To be precise, Takens’ result guarantees that two points on the attractor do not map to the same point in the reconstruction space, but there are no guarantees that close points on the attractor remain close under this mapping (or far points remain far). Consequently, relatively small imperfections could have arbitrarily large, unwanted effects when the delay coordinate map is used in applications..."
The second paper takes a somewhat related route: beyond defining specific metrics for specific objects such as Toeplitz or PSD matrices, the next step is to use those metrics to perform some computations faster (see also some of these recent comments).
A First Analysis of the Stability of Takens' Embedding by Han Lun Yap, Armin Eftekhari, Michael B. Wakin, Christopher J. Rozell
Takens' Embedding Theorem asserts that when the states of a hidden dynamical system are confined to a low-dimensional attractor, complete information about the states can be preserved in the observed time-series output through the delay coordinate map. However, the conditions for the theorem to hold ignore the effects of noise, and time-series analysis in practice requires a careful empirical determination of the sampling time and number of delays, resulting in a number of delay coordinates larger than the minimum prescribed by Takens' theorem. In this paper, we use tools and ideas in Compressed Sensing to provide a first theoretical justification for the choice of the number of delays in noisy conditions. In particular, we show that under certain conditions on the dynamical system, measurement function, number of delays and sampling time, the delay-coordinate map can be a stable embedding of the dynamical system's attractor.
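As a rough numerical illustration of the abstract's claim (not the paper's analysis; the Lorenz system, the Euler integrator, the lag, and the sample sizes are assumptions of mine), one can watch how the spread of distance ratios between attractor points and their delay vectors behaves as the number of delays grows:

```python
import numpy as np

def lorenz(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Crude Euler integration of the Lorenz system."""
    x = np.empty((n_steps, 3))
    x[0] = (1.0, 1.0, 1.0)
    for i in range(n_steps - 1):
        px, py, pz = x[i]
        x[i + 1] = x[i] + dt * np.array(
            [sigma * (py - px), px * (rho - pz) - py, px * py - beta * pz])
    return x[2000:]  # discard the transient

traj = lorenz(12_000)
s = traj[:, 0]       # measurement function: observe the x-coordinate only
rng = np.random.default_rng(1)
lag = 5

for num_delays in (3, 7, 15):
    n = len(s) - (num_delays - 1) * lag
    V = np.column_stack([s[i * lag : i * lag + n] for i in range(num_delays)])
    idx = rng.choice(n, size=400, replace=False)   # V[j] pairs with traj[j]
    dA = np.linalg.norm(traj[idx][:, None] - traj[idx][None], axis=-1)
    dV = np.linalg.norm(V[idx][:, None] - V[idx][None], axis=-1)
    mask = dA > 1e-6                               # drop self-distances
    r = dV[mask] / dA[mask] / np.sqrt(num_delays)  # normalize by sqrt(#delays)
    print(f"{num_delays:2d} delays: ratio spread [{r.min():.2f}, {r.max():.2f}]")
```

A stable embedding in the paper's sense means these ratios are uniformly bounded away from zero and infinity; the conditions under which that holds are what the paper makes precise.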
Riemannian preconditioning by Bamdev Mishra, Rodolphe Sepulchre
The paper exploits a basic connection between sequential quadratic programming and Riemannian gradient optimization to address the general question of selecting a metric in Riemannian optimization. The proposed method is shown to be particularly insightful and efficient in quadratic optimization with orthogonality and/or rank constraints, which covers most current applications of Riemannian optimization in matrix manifolds.
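For a sense of the setting, here is a minimal sketch of Riemannian gradient descent on the unit sphere, the simplest orthogonality constraint, minimizing the quadratic cost x^T A x (the matrix size, step size, and iteration count are illustrative assumptions). It uses the plain Euclidean metric; the paper's contribution is precisely the choice of a better metric, i.e., a preconditioner, which this sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
A = B + B.T                        # symmetric matrix defining the quadratic cost

x = rng.standard_normal(50)
x /= np.linalg.norm(x)             # start on the unit sphere

step = 0.01
for _ in range(2000):
    egrad = 2 * A @ x                    # Euclidean gradient of x^T A x
    rgrad = egrad - (x @ egrad) * x      # project onto the tangent space at x
    x = x - step * rgrad                 # gradient step in the tangent space
    x /= np.linalg.norm(x)               # retract back onto the sphere

# The minimizer is the eigenvector for the smallest eigenvalue of A.
print("Rayleigh quotient:", x @ A @ x)
print("smallest eigenvalue:", np.linalg.eigvalsh(A)[0])
```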