[**Update**: since 2007, when this entry was written, numerous talks on Compressed Sensing have been put on video. All of those videos are included on this page, along with the ones mentioned here.]

I just found out about the presentations and videos of the talks given at the Summer School on Compressive Sampling and Frontiers in Signal Processing that took place at IMA on June 4-15, 2007. They are all in Real format. This is a very nice initiative, as there had not been many videos on the subject of compressed sensing before. If you like these, I would suggest sending feedback to the organizers of the meeting and thanking them for putting these videos online.

Richard Baraniuk

- An introduction to transform coding, Lecture 1 Slides (pdf)
- Compressive sensing for time signals: analog to information conversion, Lecture 2 Slides (pdf), Talks(A/V) (ram)
- Compressive sensing for detection and classification problems, Lecture 3 Slides (pdf), Talks(A/V) (ram)
- Multi-signal, distributed compressive sensing, Lecture 4 Slides (pdf), Talks(A/V) (ram)
- Compressive imaging with a single-pixel camera, Lecture 5 Slides (pdf), Talks(A/V) (ram)

Emmanuel Candès

- Sparsity, Talks(A/V) (ram)
- After a rapid and glossy introduction to compressive sampling, or compressed sensing as it is also called, the lecture will introduce sparsity as a key modeling tool and review the crucial role it plays in various areas such as data compression, statistical estimation and scientific computing.
- Sparsity and the l1 norm, Talks(A/V) (ram)
- In many applications, one often has fewer equations than unknowns. While this seems hopeless, we will show that the premise that the object we wish to recover is sparse or compressible radically changes the problem, making the search for solutions feasible. This lecture discusses the importance of the l1-norm as a sparsity promoting functional and will go through a series of examples touching on many areas of data processing.
- Compressive sampling: sparsity and incoherence, Talks(A/V) (ram)
- Compressed sensing essentially relies on two tenets: the first is that the object we wish to recover is compressible in the sense that it has a sparse expansion in a set of basis functions; the second is that the measurements we make (the sensing waveforms) must be incoherent with these basis functions. This lecture will introduce key results in the field, such as a new kind of sampling theorem stating that one can sample a spectrally sparse signal at a rate close to the information rate, and this without information loss.
- The uniform uncertainty principle, Talks(A/V) (ram)
- We introduce a strong form of uncertainty relation and discuss its fundamental role in the theory of compressive sampling. We give examples of random sensing matrices obeying this strong uncertainty principle; e.g. Gaussian matrices.
- The role of probability in compressive sampling, Talks(A/V) (ram)
- This lecture will discuss the crucial role played by probability in compressive sampling; we will discuss techniques for obtaining nonasymptotic results about extremal eigenvalues of random matrices. Of special interest are the role played by high-dimensional convex geometry and techniques from geometric functional analysis, such as Rudelson's selection lemma, and the role played by powerful results in the probabilistic theory of Banach spaces, such as Talagrand's concentration inequality.
- Robust compressive sampling and connections with statistics, Talks(A/V) (ram)
- We show that compressive sampling is, perhaps surprisingly, robust vis-à-vis modeling and measurement errors.
- Robust compressive sampling and connections with statistics (continued), Talks(A/V) (ram)
- We show that accurate estimation from noisy undersampled data is sometimes possible and connect our results with a large literature in statistics concerned with high dimensionality; that is, situations in which the number of observations is less than the number of parameters.
- Connections with information and coding theory, Talks(A/V) (ram)
- We morph compressive sampling into an error correcting code, and explore the implications of this sampling theory for lossy compression and some of its relationship with universal source coding.
- Modern convex optimization, Talks(A/V) (ram)
- We will survey the literature on interior point methods which are very efficient numerical algorithms for solving large scale convex optimization problems.
- Applications, experiments and open problems, Talks(A/V) (ram)
- We discuss several applications of compressive sampling in the areas of analog-to-digital conversion and biomedical imaging, review some numerical experiments in new directions, and conclude by exposing the participants to some important open problems.
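
The underdetermined-recovery premise running through these lectures can be sketched in a few lines of code: pose min ||x||_1 subject to Ax = y as a linear program via the standard split x = u - v with u, v >= 0. This is only a toy illustration; the dimensions, the Gaussian ensemble, and the use of SciPy's `linprog` are my own assumptions, not material from the talks.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 20, 40, 3  # m measurements, n unknowns, k nonzeros

# k-sparse ground-truth signal
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# Gaussian sensing matrix and undersampled measurements (m < n)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x0

# min ||x||_1 s.t. Ax = y, as an LP over [u; v] with x = u - v
# (linprog's default variable bounds are already [0, inf))
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y)
x_hat = res.x[:n] - res.x[n:]
print("max recovery error:", np.max(np.abs(x_hat - x0)))
```

Despite having half as many equations as unknowns, the l1 program typically returns the sparse signal exactly, which is the phenomenon these lectures make rigorous.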

Ronald DeVore

- Signal encoding, Lecture 1 Slides (pdf), Talks(A/V) (ram)
- Shannon-Nyquist Theory, Pulse Code Modulation, Sigma-Delta Modulation, Kolmogorov entropy, optimal encoding.
- Compression, Lecture 2 Slides (pdf), Talks(A/V) (ram)
- Best k-term approximation for bases and dictionaries, decay rates, approximation classes, application to image compression via wavelet decompositions.
- Discrete compressed sensing, Lecture 3 Slides (pdf), Talks(A/V) (ram)
- The problem, best matrices for classes, Gelfand widths and their connection to compressed sensing.
- The restricted isometry property (RIP), Talks(A/V) (ram)
- Performance of compressed sensing under RIP.
- Construction of CS matrices with best RIP, Talks(A/V) (ram)
- Bernoulli and Gaussian random variables.
- Performance of CS matrices revisited, Lecture 6 Slides (pdf), Talks(A/V) (ram)
- Proofs of the Kashin-Gluskin theorems.
- Performance in probability, Lecture 7 Slides (pdf), Talks(A/V) (ram)
- Examples of performance for Gaussian and Bernoulli ensembles.
- Decoders, Lecture 8 Slides (pdf), Talks(A/V) (ram)
- l1 minimization, greedy algorithms, iterated least squares.
- Performance of iterated least squares, Paper (pdf), Talks(A/V) (ram)
- Convergence and exponential convergence.
- Deterministic constructions of CS matrices, Lecture 10 Slides (pdf), Talks(A/V) (ram)
- Constructions from finite fields, circulant matrices.
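
As a companion to the decoder lectures above, here is a minimal sketch of an iteratively reweighted least squares decoder for min ||x||_1 subject to Ax = y. The fixed smoothing parameter `eps`, the iteration count, and the toy dimensions are illustrative assumptions of mine; practical IRLS schemes drive `eps` toward zero adaptively.

```python
import numpy as np

def irls(A, y, iters=100, eps=1e-6):
    """Iteratively reweighted least squares for min ||x||_1 s.t. Ax = y.
    Each sweep solves a weighted least-squares problem in closed form."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]  # minimum-energy start
    for _ in range(iters):
        # weights w_i = 1/(|x_i| + eps); D carries 1/w_i on its diagonal
        D = np.diag(np.abs(x) + eps)
        x = D @ A.T @ np.linalg.solve(A @ D @ A.T, y)
    return x

rng = np.random.default_rng(1)
m, n, k = 20, 40, 3
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = irls(A, A @ x0)
print("recovery error:", np.linalg.norm(x_hat - x0))
```

The reweighting progressively penalizes small entries, so the iterates drift from the minimum-energy solution toward the sparse one, which is the convergence behavior the "performance of iterated least squares" lecture analyzes.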

Anna Gilbert

- Algorithms for Compressed Sensing, I, Slides (pdf), Talks(A/V) (ram)
- What algorithmic problem do we mean by Compressed Sensing? There are a variety of alternatives, each with different algorithmic solutions (both theoretical and practical). I will discuss some of the different types of results from the combinatorial to the probabilistic.
- Algorithms for Compressed Sensing, II, Lecture notes (pdf), Talks(A/V) (ram)
- What do these algorithms all have in common? What are the common goals of the problems and how do they achieve them? I will discuss several known techniques and open problems.
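
One family of algorithms covered by these talks is greedy pursuit. A minimal sketch of Orthogonal Matching Pursuit, under my own toy assumptions (Gaussian ensemble, sparsity k known in advance):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily select k columns of A,
    re-fitting the coefficients by least squares after each pick."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(2)
m, n, k = 20, 40, 3
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = omp(A, A @ x0, k)
print("recovery error:", np.linalg.norm(x_hat - x0))
```

Each iteration costs one matrix-vector product and a small least-squares solve, which is the speed-versus-guarantees trade-off against l1 minimization that these lectures explore.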

- Welcome and introduction, Talks(A/V) (ram)

Leon Axel (New York University), Steen Moeller (University of Minnesota)

- Introduction to MRI Slides (pdf), Slides (ppt), Talks(A/V) (ram)

Discussion (A/V) (ram)

Discussion (A/V) (ram)

Short presentations by participants

- Short presentation by participants 1 (ram)
- Short presentation by participants 2 (ram)
- Short presentation by participants 3 (ram)
- Short presentation by participants 4 (ram)
- Short presentation by participants 5 (ram)

Liked this entry? Subscribe to the Nuit Blanche feed; there's more where that came from.

If you think this blog provides a service, please support it by ordering through the Amazon - Nuit Blanche Reference Store.

## 3 comments:

Hi,

Can you please post the slides of E. Candes as well?

Thanks

I think it would be a good idea to ask him directly.

Igor.

Hey,

Maybe it would be good to date each of the lectures: some lectures refer to others, so dating them in the order they were presented would give a better sense of the progression.

Just a suggestion.
