Tuesday, March 26, 2013

Around the blogs in 78 hours

Dustin has a challenge on his blog. Laurent Jacques is back to writing electronically. Hein delivers a series of interesting entries. Christian wonders about the use of random projections and more. The list of entries is followed by some of the papers discussed in them. Also, we recently heard about Lockheed Martin using quantum computers as part of its radar-development business. I don't know what to make of it, but to get some context, you first have to read Hein's blog entry on “Quantum information and the Brain”, which features the NIPS talk by Scott Aaronson, the idea that deterministic compressive sensing is around the corner, and finally the recent Sunday Morning Insight on Quantum Computing and the Steamrollers. Without further ado, here is the exciting news from the interwebs:


Christian
"...David Dunson presented some very recent work on compressed sensing that summed up for me into the idea of massively projecting (huge vectors of) regressors into much smaller dimension convex combinations, using random matrices for the projections. This point was somehow unclear to me. And to the first discussant Michael Wiper as well, who stressed that a completely random selection of those matrices could produce “mostly rubbish”, unless a learning mechanism was instated. The second discussant, Peter Müller, made the same point about this completely random search in a huge dimension space, while considering the survival frequency of covariates could help towards the efficiency of the method...."

Djalil
Pip discussed a linear algebra approach to the Erdős Discrepancy Problem.
Dick
John
Laurent mentions a panorama on multiscale geometric representations (see the paper below).
Hein
2Physics
Bob
Dustin
Dirk
The entries mentioned:



Bayesian Compressed Regression by Rajarshi Guhaniyogi, David B. Dunson

As an alternative to variable selection or shrinkage in high dimensional regression, we propose to randomly compress the predictors prior to analysis. This dramatically reduces storage and computational bottlenecks, performing well when the predictors can be projected to a low dimensional linear subspace with minimal loss of information about the response. As opposed to existing Bayesian dimensionality reduction approaches, the exact posterior distribution conditional on the compressed data is available analytically, speeding up computation by many orders of magnitude while also bypassing robustness issues due to convergence and mixing problems with MCMC. Model averaging is used to reduce sensitivity to the random projection matrix, while accommodating uncertainty in the subspace dimension. Strong theoretical support is provided for the approach by showing near parametric convergence rates for the predictive density in the large p small n asymptotic paradigm. Practical performance relative to competitors is illustrated in simulations and real data applications.
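Just to make the compression step concrete, here is a minimal sketch (Python/NumPy) of the general recipe, not the authors' code: project the predictors with a random Gaussian matrix, then run a conjugate Bayesian linear regression on the compressed features so the posterior comes out in closed form. The compressed dimension m and the prior/noise variances tau2 and sigma2 are illustrative assumptions on my part:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "large p, small n" data with a low-dimensional signal.
n, p, m = 50, 1000, 20            # m << p: compressed dimension (illustrative)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0
y = X @ beta + 0.1 * rng.standard_normal(n)

# Randomly compress the predictors prior to analysis.
Phi = rng.standard_normal((p, m)) / np.sqrt(m)   # random projection matrix
Z = X @ Phi                                      # n x m compressed design

# Conjugate Bayesian linear regression on Z: the posterior is analytic,
# no MCMC needed (the point made in the abstract above).
tau2, sigma2 = 1.0, 0.01                         # assumed prior/noise variances
V = np.linalg.inv(Z.T @ Z / sigma2 + np.eye(m) / tau2)  # posterior covariance
mu = V @ Z.T @ y / sigma2                               # posterior mean

# Prediction routes new predictors through the same projection.
X_new = rng.standard_normal((5, p))
print((X_new @ Phi) @ mu)
```

Guhaniyogi and Dunson then average over several independent draws of Phi (and over the compressed dimension) to tame the “mostly rubbish” worry raised by the discussants in Christian's entry; repeating the block above for a few projections and weighting them by marginal likelihood would mimic that step.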

On the optimality of a L1/L1 solver for sparse signal recovery from sparsely corrupted compressive measurements by Laurent Jacques
Abstract: This short note proves the instance optimality of an ℓ1/ℓ1 solver, i.e., a variant of basis pursuit denoising with an ℓ1-fidelity constraint, when applied to the estimation of sparse (or compressible) signals observed by sparsely corrupted compressive measurements. The approach simply combines two known results due to Y. Plan, R. Vershynin, and E. Candès.
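For context, the decoder in question is basis pursuit with the usual ℓ2 fidelity swapped for an ℓ1 one: minimize ||x||_1 subject to ||y − Φx||_1 ≤ ε, which is what makes it robust to a few grossly corrupted measurements. Here is a hedged sketch with cvxpy; the problem sizes, the Gaussian Φ, and the choice of ε are my illustrative assumptions, not the note's setup:

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)

n, m, k = 200, 80, 5                   # ambient dim, measurements, sparsity
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# Sparsely corrupted measurements: a handful of entries are grossly wrong.
corruption = np.zeros(m)
corruption[rng.choice(m, 3, replace=False)] = 5.0
y = Phi @ x0 + corruption

# The L1/L1 decoder: L1 objective with an L1 (not L2) fidelity constraint.
x = cp.Variable(n)
eps = 1.05 * np.abs(corruption).sum()  # assumed bound on the corruption
prob = cp.Problem(cp.Minimize(cp.norm1(x)),
                  [cp.norm1(y - Phi @ x) <= eps])
prob.solve()
print("recovery error:", np.linalg.norm(x.value - x0))
```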


A panorama on multiscale geometric representations, intertwining spatial, directional and frequency selectivity by Laurent Jacques, Laurent Duval, Caroline Chaux, Gabriel Peyré
The richness of natural images makes the quest for optimal representations in image processing and computer vision challenging. The latter observation has not prevented the design of image representations, which trade off between efficiency and complexity, while achieving accurate rendering of smooth regions as well as reproducing faithful contours and textures. The most recent ones, proposed in the past decade, share a hybrid heritage highlighting the multiscale and oriented nature of edges and patterns in images. This paper presents a panorama of the aforementioned literature on decompositions in multiscale, multi-orientation bases or dictionaries. They typically exhibit redundancy to improve sparsity in the transformed domain and sometimes its invariance with respect to simple geometric deformations (translation, rotation). Oriented multiscale dictionaries extend traditional wavelet processing and may offer rotation invariance. Highly redundant dictionaries require specific algorithms to simplify the search for an efficient (sparse) representation. We also discuss the extension of multiscale geometric decompositions to non-Euclidean domains such as the sphere or arbitrary meshed surfaces. The etymology of panorama suggests an overview, based on a choice of partially overlapping “pictures”. We hope that this paper will contribute to the appreciation and apprehension of a stream of current research directions in image understanding.
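As a small numerical aside on the sparsity these dictionaries are after, the sketch below (PyWavelets, assumed installed) decomposes a piecewise-smooth toy image in a plain separable wavelet basis, the non-redundant baseline that the oriented, redundant dictionaries surveyed here improve upon, and checks how much energy the few largest coefficients capture. The image, the 'db2' wavelet, and the depth are arbitrary choices:

```python
import numpy as np
import pywt

# A piecewise-smooth toy image stands in for a natural image.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
img += 0.05 * np.random.default_rng(2).standard_normal((64, 64))

# Separable multiscale decomposition.
coeffs = pywt.wavedec2(img, 'db2', level=3)
flat = np.concatenate([coeffs[0].ravel()] +
                      [d.ravel() for level in coeffs[1:] for d in level])

# Most of the energy concentrates in very few coefficients.
mag = np.sort(np.abs(flat))[::-1]
top = mag[: len(mag) // 20]            # largest 5% of coefficients
print("energy in top 5%:", (top ** 2).sum() / (mag ** 2).sum())
```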
