
Thursday, November 19, 2009

CS: Why Gabor Frames?, Compressive Inverse Scattering II, Compressed Counting, OBIC measurements

I just found the following interesting paper on arXiv: Why Gabor Frames? Two Fundamental Measures of Coherence and their Geometric Significance by Waheed Bajwa, Robert Calderbank, and Sina Jafarpour. The abstract reads:
In the standard compressed sensing paradigm, the N × C measurement matrix is required to act as a near isometry on all k-sparse signals. This is the Restricted Isometry Property or k-RIP. It is known that certain probabilistic processes generate measurement or sensing matrices that satisfy k-RIP with high probability. However, no polynomial-time algorithm is known for verifying that a sensing matrix with the worst-case coherence μ satisfies k-RIP with k greater than μ^(−1). In contrast, this paper provides simple conditions that, when satisfied, guarantee that a deterministic sensing matrix acts as a near isometry on all but an exponentially small fraction of k-sparse signals. These conditions are defined in terms of the worst-case coherence μ and the expected coherence ν among the columns of the measurement matrix. Under the assumption that C ≥ N^2 and ν ≤ N^(−1), the sparsity level k is determined by μ^(−2), while the fraction of “bad” k-sparse signals is determined by ν^(−2) and μ^(−2). In contrast to the k-RIP condition, these conditions are also extremely easy to check. Applying these conditions to Gabor frames shows that it is possible to successfully recover k-sparse signals for k = O(μ^(−2)). In particular, this implies that Gabor frames generated from the Alltop sequence can successfully recover all but an exponentially small fraction of k-sparse signals for k = O(N).
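To get a concrete feel for these two coherence measures, here is a small numerical sketch of my own (not from the paper): it builds the Gabor frame generated by the Alltop sequence and computes the worst-case coherence μ together with one common definition of the average coherence ν. The helper names are made up for the illustration.

```python
import numpy as np

def alltop_gabor_frame(N):
    """Build the N x N^2 Gabor frame generated by the Alltop sequence
    (N is assumed to be a prime >= 5)."""
    n = np.arange(N)
    g = np.exp(2j * np.pi * n**3 / N) / np.sqrt(N)   # Alltop seed sequence
    cols = []
    for k in range(N):                                # time shifts
        shifted = np.roll(g, k)
        for l in range(N):                            # frequency (modulation) shifts
            cols.append(np.exp(2j * np.pi * l * n / N) * shifted)
    return np.column_stack(cols)                      # unit-norm columns by construction

def coherence_measures(A):
    """Worst-case coherence mu and (one common definition of) average coherence nu."""
    G = A.conj().T @ A                                # Gram matrix of the unit-norm columns
    off = G - np.diag(np.diag(G))                     # off-diagonal inner products
    mu = np.abs(off).max()
    C = A.shape[1]
    nu = np.abs(off.sum(axis=1) / (C - 1)).max()
    return mu, nu

N = 7
A = alltop_gabor_frame(N)
mu, nu = coherence_measures(A)
print(f"N={N}: mu={mu:.4f} (1/sqrt(N)={1/np.sqrt(N):.4f}), nu={nu:.2e}")
```

For prime N the worst-case coherence comes out as 1/√N, which is exactly the regime where the abstract's k = O(μ^(−2)) = O(N) guarantee applies.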
The following papers might be a little older (a month or two):

Compressive Inverse Scattering II. SISO Measurements with Born scatterers by Albert Fannjiang. The abstract reads:
Inverse scattering methods capable of compressive imaging are proposed and analyzed. The methods employ random and repeated (multiple-shot) single-input-single-output (SISO) measurements in which the frequency, the chosen incidence and the sampling angle are related in a precise way, and they are capable of recovering exactly (point or extended) scatterers of sufficiently low sparsity.
For point targets, various sampling techniques are proposed to transform the scattering matrix into the random Fourier matrix. Two schemes are particularly interesting: the first one employs multiple frequencies with the sampling angle always in the back-scattering direction, resembling synthetic aperture (SA) imaging; the second employs only a single frequency with the sampling angle in the (nearly) forward scattering direction in the high frequency limit, resembling that of X-ray tomography. For extended targets, the Littlewood-Paley basis is used in the analysis. A specially designed sampling scheme then transforms the scattering matrix into a block-diagonal matrix, with each block being the random Fourier matrix corresponding to one of the multiple dyadic scales of the extended target. In other words, with the Littlewood-Paley basis and the proposed sampling scheme, the different dyadic scales of the target are decoupled and can therefore be reconstructed scale-by-scale by the proposed method.
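The recovery step itself is not spelled out in the abstract, but the punchline is that the sampling schemes reduce the problem to sparse recovery with a random partial Fourier matrix. Here is a generic sketch of that reduced problem, my own and with made-up sizes, using orthogonal matching pursuit as a stand-in for whatever solver one prefers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random partial Fourier sensing matrix: the role the paper's sampling schemes
# play is to reduce the scattering matrix to (essentially) this object.
N, M, k = 256, 64, 5                        # grid size, measurements, sparsity (toy values)
rows = rng.choice(N, size=M, replace=False)
F = np.exp(-2j * np.pi * np.outer(rows, np.arange(N)) / N) / np.sqrt(M)

# Sparse "scatterer" vector: k point scatterers with random complex amplitudes
x = np.zeros(N, dtype=complex)
support = rng.choice(N, size=k, replace=False)
x[support] = rng.standard_normal(k) + 1j * rng.standard_normal(k)

y = F @ x                                   # SISO-style measurement vector

def omp(A, y, k):
    """Orthogonal matching pursuit: greedy recovery of a k-sparse vector."""
    residual, idx = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.conj().T @ residual)))   # most correlated column
        idx.append(j)
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None) # refit on current support
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1], dtype=complex)
    x_hat[idx] = coef
    return x_hat

x_hat = omp(F, y, k)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```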
Estimating Entropy of Data Streams Using Compressed Counting by Ping Li. The abstract reads:
The Shannon entropy is a widely used summary statistic, for example in network traffic measurement, anomaly detection, neural computations, spike trains, etc. This study focuses on estimating the Shannon entropy of data streams. It is known that the Shannon entropy can be approximated by the Rényi entropy or the Tsallis entropy, which are both functions of the αth frequency moments and approach the Shannon entropy as α → 1. Compressed Counting (CC) [24] is a new method for approximating the αth frequency moments of data streams. Our contributions include:
• We prove that Rényi entropy is (much) better than Tsallis entropy for approximating Shannon entropy.
• We propose the optimal quantile estimator for CC, which considerably improves the estimators in [24].
• Our experiments demonstrate that CC is indeed highly effective in approximating the moments and entropies. We also demonstrate the crucial importance of utilizing the variance-bias trade-off.
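As a reminder of why the α → 1 limit matters here, a tiny sketch of mine, on a synthetic Zipf-like count vector rather than an actual stream, showing the Rényi and Tsallis entropies approaching the Shannon entropy as α → 1:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic skewed "stream": item counts drawn from a Zipf-like distribution (a toy stand-in).
samples = rng.zipf(1.5, size=10_000)
counts = np.bincount(samples[samples < 1000]).astype(float)
counts = counts[counts > 0]

p = counts / counts.sum()
shannon = -(p * np.log(p)).sum()

def renyi(p, a):
    # Renyi entropy: log(sum p_i^a) / (1 - a)
    return np.log((p ** a).sum()) / (1.0 - a)

def tsallis(p, a):
    # Tsallis entropy: (1 - sum p_i^a) / (a - 1)
    return (1.0 - (p ** a).sum()) / (a - 1.0)

for a in (1.1, 1.05, 1.01, 1.001):
    print(f"alpha={a:6.3f}  Renyi={renyi(p, a):.4f}  "
          f"Tsallis={tsallis(p, a):.4f}  Shannon={shannon:.4f}")
```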
and On the Sample Complexity of Compressed Counting by the same author.

I also found the following on the interwebs (no link to the main paper):

OBIC measurements without lasers or raster-scanning based on compressive sensing by T. Sun, G. L. Woods, Marco Duarte, Kevin Kelly, C. Li, and Yin Zhang. The abstract reads:
Laser-based failure-analysis techniques such as optical beam-induced current (OBIC) or optical beam-induced resistance change (OBIRCH) involve scanning a focused laser beam across a sample by means of a laser scanning microscope (LSM). In this paper, we demonstrate a new method of obtaining OBIC data without requiring a laser or an LSM. Instead, we employ new techniques from the field of compressive sensing (CS). We use an incoherent light source and a spatial light modulator in an image plane of the device under test, supplying a series of pseudo-random on/off illumination patterns (structured illumination) and recording the resulting electrical (photocurrent) signals from the device. Advanced algorithms allow us to reconstruct the signal for the entire die. We present results from OBIC measurements on a discrete transistor and discuss extensions of CS techniques to OBIRCH. We also demonstrate static emission images obtained using CS techniques in which the incoherent light source is replaced with a single-element infrared photon detector so that no detector array is required.
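For readers unfamiliar with this single-pixel-style acquisition, here is a toy sketch of mine (not the authors' code) of the measurement and reconstruction idea: pseudo-random ±1 patterns, realizable in hardware as an on/off pattern minus its complement, each produce one scalar photocurrent reading, and an l1 solver (plain ISTA here) recovers a map that is assumed to be sparse.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for the device: a 32x32 OBIC map with a handful of active sites (assumed sparse).
n = 32
N = n * n
x_true = np.zeros(N)
x_true[rng.choice(N, 10, replace=False)] = rng.uniform(0.5, 1.0, 10)

# Pseudo-random +/-1 illumination patterns; each pattern yields one scalar reading.
M = 300
A = rng.choice([-1.0, 1.0], size=(M, N))
y = A @ x_true

# Recover the full map with ISTA (iterative soft thresholding), a simple l1 solver.
L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
lam = 0.01 * np.max(np.abs(A.T @ y))           # heuristic regularization level
x = np.zeros(N)
for _ in range(1000):
    grad = A.T @ (A @ x - y)                   # gradient of the least-squares term
    z = x - grad / L                           # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```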
