
Thursday, May 12, 2016

PhD Thesis: Sparse and low rank signal recovery with partial knowledge, Jinchun Zhan


Sparse and low rank signal recovery with partial knowledge by Jinchun Zhan 

In the first part of this work, we study the sparse recovery problem in the presence of bounded noise. We obtain performance guarantees for modified-CS and for its improved version, modified-CS-Add-LS-Del, for recursive reconstruction of a time sequence of sparse signals from a reduced set of noisy measurements available at each time. Under mild assumptions, we show that the support recovery error and the reconstruction error of both algorithms are bounded by a small, time-invariant value at all times.
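For concreteness, here is a minimal sketch of the modified-CS idea: minimize the l1 norm of the signal outside a known support estimate, subject to a data-fidelity constraint. It is written with numpy and cvxpy; the names A, y, T and eps are illustrative and not taken from the thesis.

```python
import numpy as np
import cvxpy as cp

def modified_cs(A, y, T, eps):
    """Sketch of modified-CS: minimize the l1 norm of x outside the
    known support estimate T, subject to ||y - A x||_2 <= eps.
    A: m x n measurement matrix, y: noisy measurements,
    T: indices of the partially known support, eps: noise bound."""
    n = A.shape[1]
    outside_T = np.setdiff1d(np.arange(n), T)  # complement of the known support
    x = cp.Variable(n)
    objective = cp.Minimize(cp.norm1(x[outside_T]))
    constraints = [cp.norm(y - A @ x, 2) <= eps]
    cp.Problem(objective, constraints).solve()
    return x.value
```

In the recursive setting, T would be the support estimate from the previous time instant; modified-CS-Add-LS-Del further refines the support with add, least-squares, and delete steps.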
In the second part of this work, we study the batch sparse recovery problem in the presence of large but low-rank noise, which is also known as the problem of Robust Principal Components Analysis (RPCA). In recent work, RPCA has been posed as the problem of recovering a low-rank matrix $\mathbf{L}$ and a sparse matrix $\mathbf{S}$ from their sum, $\mathbf{M} := \mathbf{L} + \mathbf{S}$, and a provably exact convex optimization solution called PCP has been proposed. We study the following problem: assuming a partial estimate of the column space of the low-rank matrix $\mathbf{L}$ is available, we propose a simple but useful modification of the PCP idea, called modified-PCP, that allows us to use this knowledge. We derive a correctness result which shows that modified-PCP indeed requires significantly weaker incoherence assumptions than PCP when the available subspace knowledge is accurate.
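One way to encode the modified-PCP idea is to penalize the nuclear norm of $\mathbf{L}$ only on the orthogonal complement of the known subspace. The cvxpy sketch below illustrates this; it assumes G is a matrix with orthonormal columns spanning the partial column-space estimate, and the exact program and weight lam used in the thesis may differ.

```python
import numpy as np
import cvxpy as cp

def modified_pcp(M, G, lam):
    """Illustrative modified-PCP sketch: split M into L + S, penalizing
    the nuclear norm of L only outside the known subspace span(G).
    M: n1 x n2 data matrix, G: n1 x r matrix with orthonormal columns
    (partial estimate of the column space of L), lam: sparsity weight."""
    n1, n2 = M.shape
    L = cp.Variable((n1, n2))
    S = cp.Variable((n1, n2))
    P_perp = np.eye(n1) - G @ G.T  # projector onto the complement of span(G)
    objective = cp.Minimize(cp.normNuc(P_perp @ L) + lam * cp.sum(cp.abs(S)))
    cp.Problem(objective, [L + S == M]).solve()
    return L.value, S.value
```

When G has no columns (no subspace knowledge), this program reduces to standard PCP; the correctness result in the thesis quantifies how accurate subspace knowledge weakens the incoherence requirements.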
In the third part of this work, we study the "online" sparse recovery problem in the presence of low-rank noise and bounded noise, which is also known as the "online" RPCA problem. Here we study a more general version of this problem, where the goal is to recover the low-rank matrix $\mathbf{L}$ and the sparse matrix $\mathbf{S}$ from $\mathbf{M} := \mathbf{L} + \mathbf{S} + \mathbf{W}$, where $\mathbf{W}$ is a matrix of unstructured small noise. We develop and study a novel "online" RPCA algorithm based on the recently introduced Recursive Projected Compressive Sensing (ReProCS) framework. The key contribution is a correctness result for this algorithm under relatively mild assumptions.
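To make the ReProCS idea concrete, here is a highly simplified sketch of one recursive step, again with numpy and cvxpy. It assumes a current subspace estimate P_hat with orthonormal columns and a bound xi on the projected noise; the full algorithm in the thesis also performs support thresholding, least-squares refinement, and periodic subspace updates.

```python
import numpy as np
import cvxpy as cp

def reprocs_step(m_t, P_hat, xi):
    """Simplified ReProCS-style step (illustrative only).
    m_t: measurement at time t, m_t = l_t + s_t + w_t.
    P_hat: n x r matrix with orthonormal columns, current estimate of
           the subspace containing the low-rank component l_t.
    xi: bound on the noise seen after projection.
    Projecting m_t onto the complement of span(P_hat) nearly nullifies
    l_t, turning the problem into noisy sparse recovery of s_t."""
    n = m_t.shape[0]
    Phi = np.eye(n) - P_hat @ P_hat.T  # projector onto span(P_hat)^perp
    y_t = Phi @ m_t                    # approximately Phi @ s_t plus small noise
    s = cp.Variable(n)
    cp.Problem(cp.Minimize(cp.norm1(s)),
               [cp.norm(y_t - Phi @ s, 2) <= xi]).solve()
    s_hat = s.value
    l_hat = m_t - s_hat                # estimate of the low-rank part at time t
    return s_hat, l_hat
```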

 
 
