While reading the materials of Piotr Indyk's course entitled Sketching, Streaming and Sub-linear Space Algorithms, I was reminded that I had not yet mentioned this important paper by Venkatesan Guruswami, James Lee, and Alexander Razborov entitled Almost Euclidean subspaces of ell-1-N via expander codes. The abstract reads:
We give an explicit (in particular, deterministic polynomial time) construction of subspaces X ⊆ R^N of dimension (1 − o(1))N such that for every x ∈ X,

(log N)^{−O(log log log N)} · √N · ||x||_2 ≤ ||x||_1 ≤ √N · ||x||_2.

If we are allowed to use N^{1/log log N} ≤ N^{o(1)} random bits and dim(X) ≥ (1 − η)N for any fixed constant η, the lower bound can be further improved to (log N)^{−O(1)} · √N · ||x||_2. Our construction makes use of unbalanced bipartite graphs to impose local linear constraints on vectors in the subspace, and our analysis relies on expansion properties of the graph. This is inspired by similar constructions of error-correcting codes.
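The distortion bounded in the abstract can be checked numerically: Cauchy-Schwarz gives ||x||_1 ≤ √N ||x||_2 for every x in R^N, and the "almost Euclidean" property is that the ratio ||x||_1 / (√N ||x||_2) also stays bounded away from zero on the subspace. A minimal sketch using a random Gaussian subspace (which has this property with high probability; the paper's contribution is a deterministic, expander-based construction, which this sketch does not reproduce):

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 512, 256            # ambient dimension and subspace dimension

# Columns of B span a random subspace of R^N; such subspaces are
# "almost Euclidean" with high probability.
B = rng.standard_normal((N, d))

ratios = []
for _ in range(200):
    x = B @ rng.standard_normal(d)      # random vector in the subspace
    l1, l2 = np.linalg.norm(x, 1), np.linalg.norm(x, 2)
    ratios.append(l1 / (np.sqrt(N) * l2))

# Cauchy-Schwarz: the ratio never exceeds 1 for any vector ...
assert max(ratios) <= 1.0
# ... and on this subspace it stays well above 0.
print(f"min ratio = {min(ratios):.3f}, max ratio = {max(ratios):.3f}")
```

For i.i.d. Gaussian vectors the ratio concentrates around √(2/π) ≈ 0.8, which is what the printout shows; the hard part, addressed by the paper, is certifying such a lower bound deterministically.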
Of interest is the note entitled "Relationship to compressed sensing" before paragraph 2. A historical view making sense of this result (and of Piotr Indyk's Uncertainty Principles, Extractors, and Explicit Embeddings of L2 into L1) was given in this guest lecture by Venkatesan Guruswami entitled Euclidean Subspaces and Compressed Sensing. To a dim-witted person like me, it certainly is enlightening. I have added the paper to the list of deterministic measurement matrices in the big picture on CS.
[A summary of the results of this paper can be found here.]
I also discovered the course entitled Compressive Sensing by Mark Davenport, Richard Baraniuk, and Ronald DeVore, where I finally learned something about Gelfand n-widths. The course content reads as follows:
- Introduction
- The Shannon-Whittaker Sampling Theorem
- Optimal Encoding
- Kolmogorov Entropy
- Optimal Encoding of Bandlimited Signals
- Stable Signal Representations
- Preliminaries
- New Signal Models
- Sparse Approximation and ℓp Spaces
- Thresholding and Greedy Bases
- Greedy Algorithms
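As a quick numerical illustration of the first technical item in the outline above: the Shannon-Whittaker theorem says a signal bandlimited to B Hz is exactly determined by its samples at rate 2B, recovered by sinc interpolation. A minimal sketch (the test signal, bandwidth, and window size are arbitrary choices for illustration):

```python
import numpy as np

B = 4.0                 # bandwidth in Hz
T = 1.0 / (2.0 * B)     # Nyquist sampling interval

def f(t):
    # a bandlimited test signal: frequencies 1 Hz and 3 Hz, both below B
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * 3.0 * t)

# sample f(nT) on a long window to keep truncation error small
n = np.arange(-2000, 2001)
samples = f(n * T)

def reconstruct(t):
    # Shannon-Whittaker interpolation: sum_n f(nT) * sinc((t - nT)/T)
    # (np.sinc is the normalized sinc, sin(pi x)/(pi x))
    return np.sum(samples * np.sinc((t - n * T) / T))

t0 = 0.123
print(abs(reconstruct(t0) - f(t0)))   # small; only truncation error remains
```

With an infinite sum the reconstruction would be exact at every t; the residual here comes only from truncating the interpolation series to a finite window.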