My name is Igor Carron
## Tuesday, May 26, 2015

### Randomized Robust Subspace Recovery for High Dimensional Data Matrices

I think this is a first! A phase transition has been found for a randomized algorithm. Welcome to the new mapmakers. The map will shortly be added to the Advanced Matrix Factorization Jungle page.

Randomized Robust Subspace Recovery for High Dimensional Data Matrices by Mostafa Rahmani, George Atia

Principal Component Analysis (PCA) is a fundamental mathematical tool with broad applicability in numerous scientific areas. In this paper, a randomized PCA approach that is robust to the presence of outliers and whose complexity is independent of the dimension of the given data matrix is proposed. The proposed approach is a two-step algorithm. First, the given data matrix is turned into a small random matrix. Second, the column subspace of the low-rank matrix is learned and the outlying columns are located. The low-dimensional geometry of the low-rank matrix is exploited to substantially reduce the complexity of the algorithm. A small random subset of the columns of the given data matrix is selected, then the selected data is projected into a random low-dimensional subspace. The subspace learning algorithm works with this compressed small-size data. Two ideas for robust subspace learning are proposed to work under different model assumptions. The first idea is based on the linear dependence between the columns of the low-rank matrix, and the second idea is based on the independence between the column subspace of the low-rank matrix and the subspace of the outlying columns. The proposed subspace learning approach has a closed-form expression and the outlier detector is a simple subspace projection operation. We derive sufficient conditions for the proposed method to extract the true subspace and identify the outlying data. These conditions are less stringent than those for existing methods. In particular, a remarkable portion of the given data is allowed to be outlier data.
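The two-step scheme the abstract describes can be sketched in NumPy. The sketch dimension, sample sizes, and thresholds below are illustrative assumptions (not the paper's values), and the coherence filter is only a simple stand-in for the paper's first robust-learning idea, which exploits the linear dependence between the columns of the low-rank part:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a rank-3 matrix with a few outlying columns.
n_rows, n_cols, rank = 500, 200, 3
U = rng.standard_normal((n_rows, rank))
V = rng.standard_normal((rank, n_cols))
D = U @ V
true_outliers = rng.choice(n_cols, size=10, replace=False)
D[:, true_outliers] = rng.standard_normal((n_rows, 10))  # isolated random directions

# Step 1 (compression): sample a random subset of the columns, then
# project the rows into a random low-dimensional subspace.
m = 100                                    # sketch dimension (assumed)
sample = rng.choice(n_cols, size=80, replace=False)
Phi = rng.standard_normal((m, n_rows)) / np.sqrt(m)
M = Phi @ D[:, sample]                     # small compressed matrix

# Step 2 (robust subspace learning): inlier columns are linearly
# dependent on each other, so each is highly coherent with some other
# sampled column, while an outlier points in an isolated direction.
Mn = M / np.linalg.norm(M, axis=0)
G = np.abs(Mn.T @ Mn)
np.fill_diagonal(G, 0.0)
coherence = G.max(axis=1)                  # max |cos| with any other column
keep = coherence > 0.5                     # threshold is an assumption

# Learn the column subspace from the retained (presumed inlier) columns.
B, _, _ = np.linalg.svd(M[:, keep], full_matrices=False)
B = B[:, :rank]

# Outlier detection on all columns: a simple subspace projection
# operation, flagging columns with a large relative residual.
X = Phi @ D
resid = np.linalg.norm(X - B @ (B.T @ X), axis=0) / np.linalg.norm(X, axis=0)
flagged = np.where(resid > 0.5)[0]
```

Note that the SVD and outlier test operate only on the `m`-dimensional sketch, which is how the complexity becomes independent of the ambient dimension of the data matrix.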

Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.