Xianbiao Shu just sent me the following:
Hi Igor,

I am Xianbiao, currently working at Qualcomm after graduating from the University of Illinois at Urbana-Champaign (UIUC). Recently, I have two interesting papers:

(1) "Non-Local Compressive Sampling Recovery", which can significantly reduce the sampling rate by using non-local patch correlation.

(2) "Robust Orthonormal Subspace Learning: Efficient Recovery of Corrupted Low-rank Matrices", which can recover low-rank matrices at linear complexity.

Would you do me a favor by posting them on Nuit Blanche? The detailed paper information and code are shared on

Thanks a lot.

Best regards,
Sure Xianbiao! Here is the first one:
Robust Orthonormal Subspace Learning: Efficient Recovery of Corrupted Low-rank Matrices by Xianbiao Shu, Fatih Porikli, Narendra Ahuja
Low-rank matrix recovery from a corrupted observation has many applications in computer vision. Conventional methods address this problem by iterating between nuclear norm minimization and sparsity minimization. However, iterative nuclear norm minimization is computationally prohibitive for large-scale data (e.g., video) analysis. In this paper, we propose a Robust Orthonormal Subspace Learning (ROSL) method to achieve efficient low-rank recovery. Our intuition is a novel rank measure on the low-rank matrix that imposes the group sparsity of its coefficients under orthonormal subspace. We present an efficient sparse coding algorithm to minimize this rank measure and recover the low-rank matrix at quadratic complexity of the matrix size. We give theoretical proof to validate that this rank measure is lower bounded by nuclear norm and it has the same global minimum as the latter. To further accelerate ROSL to linear complexity, we also describe a faster version (ROSL+) empowered by random sampling. Our extensive experiments demonstrate that both ROSL and ROSL+ provide superior efficiency against the state-of-the-art methods at the same level of recovery accuracy.
The attendant code is on Xianbiao Shu's page.
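For readers who want a feel for the idea before digging into the paper, here is a rough Python sketch of the kind of alternating scheme the abstract describes: an orthonormal subspace basis D, row-group-sparse coefficients A (group sparsity of A's rows standing in for the rank of the product D A), and an elementwise-sparse error E. This is my own illustrative toy, not the authors' code; the function name, parameters, and update order are all made up for exposition.

```python
import numpy as np

def soft(x, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def rosl_sketch(X, k, lam=0.3, n_iter=50, seed=0):
    """Toy alternating scheme in the spirit of ROSL (illustrative only):
    model X ~ D @ A + E with D an m-by-k orthonormal basis, A row-group-
    sparse coefficients, and E an elementwise-sparse error matrix.
    """
    m, n = X.shape
    rng = np.random.default_rng(seed)
    D, _ = np.linalg.qr(rng.standard_normal((m, k)))  # random orthonormal init
    A = D.T @ X
    E = np.zeros_like(X)
    for _ in range(n_iter):
        Y = X - E
        # Subspace step: block coordinate descent with Gram-Schmidt,
        # shrinking each coefficient row as a group.
        for i in range(k):
            R = Y - D @ A + np.outer(D[:, i], A[i])  # residual without atom i
            d = R @ A[i]
            if np.linalg.norm(d) <= 1e-12:           # dead atom: random restart
                d = rng.standard_normal(m)
            for _rep in range(2):                    # orthogonalize twice for
                d -= D[:, :i] @ (D[:, :i].T @ d)     # numerical safety
            D[:, i] = d / np.linalg.norm(d)
            a = D[:, i] @ R
            g = np.linalg.norm(a)                    # group (row) magnitude
            A[i] = a * max(0.0, 1.0 - lam / g) if g > 1e-12 else 0.0
        # Sparse-error step: shrink the residual of the low-rank fit.
        E = soft(X - D @ A, lam)
    return D, A, E

# Toy usage: a rank-3 matrix corrupted by sparse spikes.
rng = np.random.default_rng(1)
L = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 60))
S = np.where(rng.random((50, 60)) < 0.05, 8.0, 0.0)
D, A, E = rosl_sketch(L + S, k=10)
```

The group shrinkage on each row of A is what drives extra rows to zero, so the effective rank of D @ A drops below k without ever computing an SVD; the actual paper replaces this naive sweep with the algorithm and random-sampling acceleration described in the abstract.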
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.