
Thursday, January 08, 2015

Approximate Subspace-Sparse Recovery in the Presence of Corruptions via $\ell_1$-Minimization

Having featured Ehsan's work before, it is just a question of time before we get a peek at an implementation of this. I can't wait.



Approximate Subspace-Sparse Recovery in the Presence of Corruptions via $\ell_1$-Minimization by Ehsan Elhamifar, Mahdi Soltanolkotabi, Shankar Sastry

High-dimensional data often lie in low-dimensional subspaces corresponding to different classes they belong to. Finding sparse representations of data points in a dictionary built from the collection of data helps to uncover the low-dimensional subspaces and, as a result, address important problems such as compression, clustering, classification, subset selection and more. However, an important challenge related to real-world datasets is that the collection of data is often corrupted by measurement or process noise. In this paper, we address the problem of recovering sparse representations for noisy data points in a dictionary whose columns correspond to noisy data points lying close to a union of subspaces. We consider a constrained $\ell_1$-minimization program and study conditions under which the solution of the optimization recovers a representation of a noisy point as a linear combination of a few noisy points from the same subspace. Our framework is based on a novel generalization of the null-space property to the setting where data lie in multiple subspaces, the number of data points in each subspace exceeds the dimension of the subspace, and all data points are corrupted by noise. We do not impose any randomness assumption on the arrangement of subspaces or distribution of data points in each subspace. We show that, under appropriate conditions that depend on relationships among data points within and between subspaces, the solution of our proposed optimization satisfies a desired approximate subspace-sparse recovery. More specifically, we show that a noisy data point, close to one of the subspaces, will be reconstructed using data points from the same subspace with a small error and that the coefficients corresponding to data points in other subspaces will be sufficiently small.
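For readers who want to experiment before an implementation shows up, here is a minimal sketch of the constrained $\ell_1$-minimization program described in the abstract, written with cvxpy on a synthetic two-subspace example. The dictionary, the noise level and the tolerance eps below are illustrative assumptions, not the authors' experimental setting.

```python
# Sketch of approximate subspace-sparse recovery via constrained l1-minimization.
# Synthetic setup (two subspaces, noise level, eps) is assumed for illustration.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

# Two 2-dimensional subspaces of R^10, each sampled with 20 noisy points.
ambient_dim, sub_dim, pts_per_sub, noise = 10, 2, 20, 0.01
bases = [np.linalg.qr(rng.standard_normal((ambient_dim, sub_dim)))[0] for _ in range(2)]
Y = np.hstack([B @ rng.standard_normal((sub_dim, pts_per_sub)) for B in bases])
Y += noise * rng.standard_normal(Y.shape)   # corrupt every data point
Y /= np.linalg.norm(Y, axis=0)              # normalize the dictionary columns

# Query: a noisy point close to the first subspace, to be expressed
# in the dictionary built from the remaining noisy points.
y = Y[:, 0]
D = Y[:, 1:]

# Constrained l1-minimization: min ||c||_1  s.t.  ||y - D c||_2 <= eps
c = cp.Variable(D.shape[1])
eps = 0.05                                  # assumed noise tolerance
prob = cp.Problem(cp.Minimize(cp.norm1(c)),
                  [cp.norm(y - D @ c, 2) <= eps])
prob.solve()

# Approximate subspace-sparse recovery: large coefficients should sit on points
# from the same subspace (first pts_per_sub - 1 columns of D), small ones elsewhere.
coef = np.abs(c.value)
print("coefficient energy on the same subspace :", coef[:pts_per_sub - 1].sum())
print("coefficient energy on the other subspace:", coef[pts_per_sub - 1:].sum())
```

Under the conditions studied in the paper, the second printed quantity should be small relative to the first, which is the behavior the sketch is meant to exhibit.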
 
 
 
Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
