Saturday, August 30, 2008

CS: FRI, CS based estimation of doubly selective channels, Deterministic Dictionaries, CC, Several talks/videos and a IEEE CfP on CS

Thanks to my webcrawler, here is what I have found in the past two days:

An article that might entice beginning electrical engineers into dabbling in compressive sensing:
A Generalized Sampling Method for Finite-Rate-of-Innovation-Signal Reconstruction by Chandra Sekhar Seelamantula, Michael Unser. The abstract reads:

The problem of sampling signals that are not admissible within the classical Shannon framework has received much attention in the recent past. Typically, these signals have a parametric representation with a finite number of degrees of freedom per time unit. It was shown that, by choosing suitable sampling kernels, the parameters can be computed by employing high-resolution spectral estimation techniques. In this paper, we propose a simple acquisition and reconstruction method within the framework of multichannel sampling. In the proposed approach, an infinite stream of nonuniformly-spaced Dirac impulses can be sampled and accurately reconstructed provided that there is at most one Dirac impulse per sampling period. The reconstruction algorithm has a low computational complexity and the parameters are computed on the fly. The processing delay is minimal—just the sampling period. We propose sampling circuits using inexpensive passive devices such as resistors and capacitors. We also show how the approach can be extended to sample piecewise-constant signals with a minimal change in the system configuration. We provide some simulation results to confirm the theoretical findings.
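The closed-form flavor of this kind of recovery is easy to illustrate. Below is a toy two-channel sketch of my own, not the authors' actual circuit: a single Dirac per sampling period is fed through two ideal first-order RC filters with different time constants, and the two end-of-period samples are inverted in closed form for the Dirac's amplitude and location.

```python
import numpy as np

# Toy illustration (not the paper's exact circuit): a Dirac a*delta(t - t0)
# driven through an RC filter with impulse response exp(-t/tau)/tau gives
# (a/tau)*exp(-(t - t0)/tau) for t >= t0. Sampling two such channels once
# at the end of the period T yields two equations in the two unknowns (a, t0).

def sample_channel(a, t0, tau, T):
    """Output of an ideal first-order RC filter at time T >= t0."""
    return (a / tau) * np.exp(-(T - t0) / tau)

def recover_dirac(y1, y2, tau1, tau2, T):
    """Closed-form inversion of the two channel samples for (a, t0)."""
    # y1*tau1 / (y2*tau2) = exp(-(T - t0) * (1/tau1 - 1/tau2))
    r = (y1 * tau1) / (y2 * tau2)
    t0 = T + np.log(r) / (1.0 / tau1 - 1.0 / tau2)
    a = y1 * tau1 * np.exp((T - t0) / tau1)
    return a, t0

# At most one Dirac per sampling period, as the paper requires.
a_true, t0_true, T = 1.7, 0.35, 1.0
tau1, tau2 = 0.5, 0.25
y1 = sample_channel(a_true, t0_true, tau1, T)
y2 = sample_channel(a_true, t0_true, tau2, T)
a_hat, t0_hat = recover_dirac(y1, y2, tau1, tau2, T)
print(a_hat, t0_hat)
```

Note how the parameters come out "on the fly" from a single period's worth of samples, with no iterative solver; this is the low-delay property the abstract emphasizes.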
Here is an EUSIPCO paper that was just made available: Compressed sensing based estimation of doubly selective channels using a sparsity-optimized basis expansion by Georg Tauböck and Franz Hlawatsch. The abstract reads:

Most transmission schemes for MIMO channels assume (i) a block fading channel without delay spread and (ii) availability of channel state information at the receiver. Here, we extend the space-time matrix modulation (STMM) scheme and the iterative demodulation algorithm that we introduced previously to unknown, doubly selective MIMO channels, i.e., delay-Doppler-spread MIMO channels that are unknown to the receiver. We show that the structure inherent in STMM allows perfect reconstruction of the data when transmitting over an unknown doubly selective channel (apparently, this is not currently possible with other transmission schemes). Numerical simulations demonstrate significant performance advantages of STMM over Cayley differential unitary space-time modulation.

Of relevance to dictionary building is On some deterministic dictionaries supporting sparsity by Shamgar Gurevich, Ronny Hadani, Nir Sochen. The abstract reads:

We describe a new construction of an incoherent dictionary, referred to as the oscillator dictionary, which is based on considerations in the representation theory of finite groups. The oscillator dictionary consists of order of p^5 unit vectors in a Hilbert space of dimension p, where p is an odd prime, whose pairwise inner products have magnitude of at most 4/sqrt(p). An explicit algorithm to construct a large portion of the oscillator dictionary is presented.
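The figure of merit behind such constructions is the mutual coherence: the largest inner-product magnitude between distinct unit-norm dictionary elements, which the oscillator dictionary bounds by 4/sqrt(p). Here is a small sketch of mine (not the paper's construction) that computes the coherence of a random dictionary, the usual baseline such deterministic designs are compared against:

```python
import numpy as np

def mutual_coherence(D):
    """Largest absolute inner product between distinct unit-norm columns of D."""
    D = D / np.linalg.norm(D, axis=0)   # normalize each column
    G = np.abs(D.conj().T @ D)          # magnitudes of the Gram matrix
    np.fill_diagonal(G, 0.0)            # ignore self inner products
    return G.max()

rng = np.random.default_rng(0)
p = 101                                 # dimension (an odd prime, as in the paper)
N = 5 * p                               # a modestly overcomplete dictionary
D = rng.standard_normal((p, N))         # random baseline, NOT the oscillator dictionary
mu = mutual_coherence(D)
print(mu, 4 / np.sqrt(p))               # measured coherence vs. the paper's bound
```

Low coherence is what makes a dictionary useful for sparse recovery; the point of the paper is achieving a bound like 4/sqrt(p) deterministically rather than probabilistically.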
Here is one paper that keeps raising my interest, i.e. I keep going back to it often. If you know how to count, then you might be interested in the concept of compressed counting in A Very Efficient Scheme for Estimating Entropy of Data Streams Using Compressed Counting by Ping Li. The abstract reads:

Compressed Counting (CC) was recently proposed for approximating the αth frequency moments of data streams, for 0 < α ≤ 2, especially α ≈ 1. One direct application of CC is to estimate the entropy, which is an important summary statistic in Web/network measurement and often serves as a crucial “feature” for data mining. The Rényi entropy and the Tsallis entropy are functions of the αth frequency moments; and both approach the Shannon entropy as α → 1. Previous studies used the standard algorithm based on symmetric stable random projections to approximate the αth frequency moments and the entropy. Based on maximally-skewed stable random projections, Compressed Counting (CC) dramatically improves symmetric stable random projections, especially when α ≈ 1. This study applies CC to estimate the Rényi entropy, the Tsallis entropy, and the Shannon entropy. Our experiments on some Web crawl data demonstrate significant improvements over previous studies. When estimating the frequency moments, the Rényi entropy, and the Tsallis entropy, the improvements of CC, in terms of accuracy, appear to approach “infinity” as α → 1. When estimating Shannon entropy using Rényi entropy or Tsallis entropy, the improvements of CC, in terms of accuracy, are roughly 20- to 50-fold. When estimating the Shannon entropy from Rényi entropy, in order to achieve the same accuracy, CC only needs about 1/50 of the samples (storage space), compared to symmetric stable random projections. When estimating the Shannon entropy from Tsallis entropy, however, symmetric stable random projections cannot achieve the same accuracy as CC, even with 500 times more samples.
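The identity the abstract leans on is exact and easy to check numerically: the Rényi entropy H_α = log(Σ p_i^α)/(1 − α) and the Tsallis entropy S_α = (1 − Σ p_i^α)/(α − 1) are both functions of the αth moment, and both converge to the Shannon entropy as α → 1. A quick sketch of the limiting identity (this is just the definitions on a known distribution, not Li's streaming estimator):

```python
import numpy as np

def shannon(p):
    """Shannon entropy -sum p log p (natural log)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    """Renyi entropy, a function of the alpha-th moment sum(p**alpha)."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, alpha):
    """Tsallis entropy, another function of the same alpha-th moment."""
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

# A small distribution standing in for normalized stream frequencies.
p = np.array([0.5, 0.25, 0.125, 0.125])
H = shannon(p)
for alpha in (1.1, 1.01, 1.001):
    # Both gaps shrink toward zero as alpha -> 1.
    print(alpha, renyi(p, alpha) - H, tsallis(p, alpha) - H)
```

Li's contribution is estimating the moment Σ p_i^α from a tiny sketch of the stream; the step from moment to entropy is exactly the plug-in above.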
The following is a list of talks/presentations and attendant videos of relevance to compressive sensing, l1 minimization, and related subjects:

Lawrence Carin will talk about Exploiting Structure in Statistical Compressive-Sensing Inversion on Thursday, September 25, 2008, from 4:00 PM to 5:00 PM at Rice.
Location: 1070 Duncan Hall, Rice University, 6100 Main St, Houston, Texas, USA
The abstract of the presentation reads:
Compressive sensing involves performing sensing measurements based on projections, with the projection vectors typically constituted randomly. It is assumed that the underlying signal is compressible in a particular orthonormal basis, and the objective is to infer the sparse set of transform coefficients from the projection measurements. One therefore must solve an inverse problem to recover the underlying signal from the associated projection measurements. We examine this problem from a statistical standpoint, inferring a posterior distribution on the underlying signal. A focus is placed on non-parametric Bayesian formalisms, wherein several forms of structure within the data are exploited, to enhance inversion quality. This includes leveraging inter-relationships between multiple compressive-sensing measurements, as well as structure within the underlying data itself (e.g., structure within the transform coefficients, particularly for a wavelet basis). This talk will describe the underlying theory associated with the statistical methods, and will present several examples.

Videolectures.net seems to cover ICML '08, yet I could not find this presentation: Y. Qi, D. Liu, D.B. Dunson and L. Carin, Multi-Task Compressive Sensing with Dirichlet Process Priors, to appear in Int. Conf. Machine Learning, 2008. But Lawrence Carin's other ICML presentation, Hierarchical Kernel Stick-Breaking Process for Multi-Task Image Analysis, is there.

Another talk of interest is L1-based relaxations for sparsity recovery and graphical model selection in the high-dimensional regime
by Martin J. Wainwright. The abstract of the talk reads:

The problem of estimating a sparse signal embedded in noise arises in various contexts, including signal denoising and approximation, as well as graphical model selection. The natural optimization-theoretic formulation of such problems involves l0 "norm" constraints (i.e., penalties on the number of non-zero coefficients), which lead to NP-hard problems in general. A natural approach is to consider the use of the l1-norm as a computationally tractable surrogate, as has been pursued in both signal processing and statistics.
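The l1 surrogate the abstract mentions can be tried out in a few lines. Here is a toy basis pursuit sketch of mine (not from the talk): minimize ||x||_1 subject to Ax = y, recast as a linear program by splitting x into positive and negative parts, recovering a sparse signal from far fewer random measurements than unknowns.

```python
import numpy as np
from scipy.optimize import linprog

# Basis pursuit: min ||x||_1 s.t. Ax = y, as a linear program via the
# standard split x = u - v with u, v >= 0 (linprog's default bounds).

rng = np.random.default_rng(7)
n, m, k = 120, 50, 4                     # unknowns, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.standard_normal(k) + 2 * np.sign(rng.standard_normal(k))
y = A @ x_true                           # noiseless compressive measurements

c = np.ones(2 * n)                       # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])                # equality constraint A(u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y)      # default bounds keep u, v >= 0
x_hat = res.x[:n] - res.x[n:]
print(np.linalg.norm(x_hat - x_true))
```

With only m = 50 random measurements of an n = 120 dimensional signal with 4 non-zeros, the l1 relaxation typically returns the sparse solution exactly; the l0 formulation of the same problem would be combinatorial.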

I recently came across the interesting talks of Roman Vershynin; they are very clear for an engineer like me.

Finally, here is a Call for Papers on the subject of Compressive Sensing.

Call for Papers, IEEE Signal Processing Society, IEEE Journal of Selected Topics in Signal Processing: Special Issue on Compressive Sensing. The introduction states:

At its core, the young field of compressive sensing (CS) is nothing more than the ability of simple algorithms to solve the seemingly intractable problem of finding the sparsest solution of a severely underdetermined linear system of equations, under realistic conditions. This leads immediately to new signal reconstruction methods that are successful with surprisingly few measurements, which in turn leads to signal acquisition methods that effect compression as part of the measurement process (hence “compressive sensing”). These recent realizations (though built upon prior work exploiting signal sparsity) have spawned an explosion of research yielding exciting results in a wide range of topics, encompassing algorithms, theory, and applications.
Original papers, previously unpublished and not currently under review by another journal, are solicited for this special issue. The scope of this special issue includes, but is not limited to:
  • algorithms for CS
  • CS methods that are tolerant to noise, signal non-sparsity, or measurement nonlinearity
  • measurement/sampling procedures
  • mathematical theory of CS
  • CS for multiple signals or with additional information
  • CS for analog signals
  • signal processing of compressive measurements
  • nonadaptive signal compression or streaming dataset reduction
  • hardware implementation of CS systems
  • applications of CS
Submission information is available at http://www.ece.byu.edu/jstsp. Prospective authors are required to follow the Author’s Guide for manuscript preparation of the IEEE Transactions on Signal Processing at http://ewh.ieee.org/soc/sps/tsp. Manuscripts will be peer reviewed according to the standard IEEE process.
  • Manuscript submission due: Feb. 20, 2009
  • First review completed: May 15, 2009
  • Revised manuscript due: Jul. 1, 2009
  • Second review completed: Sep. 15, 2009
  • Final manuscript due: Oct. 15, 2009

Credit: NASA/JPL/University of Arizona, Layered Deposits within Unnamed Crater in Arabia Terra (PSP_009180_1840).
