I am stateside and a little short on time for getting things out, but you readers are helping out, so thank you. Case in point, Derin Babacan just sent me the following:
....I just wanted to let you know that we released two codes that you and your readers might be interested in.

The first one is for the paper "Sparse Bayesian Methods for Low-Rank Matrix Estimation" (https://netfiles.uiuc.edu/dbabacan/www/papers/VBLRMat.pdf), which is on recovering underlying low-rank matrices in matrix completion and robust PCA problems. The code is here: https://netfiles.uiuc.edu/dbabacan/www/software/VBLRMat.zip. The nice thing about these methods is that they can automatically estimate all required algorithmic parameters, including the rank of the unknown matrix, and the robust PCA method in particular is very fast.

The second one is for the paper "Bayesian Group-Sparse Modeling and Variational Inference" (here: https://netfiles.uiuc.edu/dbabacan/www/papers/VBGS_final.pdf). This paper presents a number of Bayesian modeling possibilities suitable for both regular sparse and group-sparse modeling. The current version of the code (https://netfiles.uiuc.edu/dbabacan/www/software/VBGS_v02.zip) is both very fast and highly accurate.

Any feedback is very welcome. I had a very nice experience sharing source code in the past: I fixed many bugs, implemented various improvements, learned about new fields and problems, and even started new collaborations!

By the way, other codes (related to older papers) for blind deconvolution, super-resolution, and compressive sensing are also posted on https://netfiles.uiuc.edu/dbabacan/www/software/

Thanks!
Best,
Derin
The papers are: Sparse Bayesian Methods for Low-Rank Matrix Estimation by S. Derin Babacan, Martin Luessi, Rafael Molina, Aggelos K. Katsaggelos. The abstract reads:
Recovery of low-rank matrices has recently seen significant activity in many areas of science and engineering, motivated by recent theoretical results for exact reconstruction guarantees and interesting practical applications. In this paper, we present novel recovery algorithms for estimating low-rank matrices in matrix completion and robust principal component analysis based on sparse Bayesian learning (SBL) principles. Starting from a matrix factorization formulation and enforcing the low-rank constraint in the estimates as a sparsity constraint, we develop an approach that is very effective in determining the correct rank while providing high recovery performance. We provide connections with existing methods in other similar problems and empirical results and comparisons with current state-of-the-art methods that illustrate the effectiveness of this approach.
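To make the "low rank as sparsity" idea concrete, here is a minimal sketch of the general approach, not the authors' released code: factor the observed matrix as Y ≈ A Bᵀ, place one shared automatic-relevance-determination (ARD) precision on each latent column of A and B, and alternate ridge-like factor updates with precision updates. Columns whose precision blows up carry no energy and are pruned, which is how the rank gets estimated automatically. All function and parameter names here are illustrative choices, and the updates are a simplified MAP/ARD caricature of the variational Bayesian procedure in the paper.

```python
import numpy as np

def ard_lowrank(Y, n_iter=100, prune_thresh=1e6, beta=1.0):
    """Sketch of ARD-style low-rank estimation: Y ~ A @ B.T with a shared
    precision gamma[k] on column k of both factors; columns whose precision
    diverges are pruned, so the surviving width estimates the rank."""
    m, n = Y.shape
    k = min(m, n)                       # start with full possible rank
    rng = np.random.default_rng(0)
    A = 0.1 * rng.standard_normal((m, k))
    B = 0.1 * rng.standard_normal((n, k))
    gamma = np.ones(k)                  # one ARD precision per latent column
    for _ in range(n_iter):
        # Ridge-like MAP updates of each factor under the ARD prior
        A = Y @ B @ np.linalg.inv(B.T @ B + np.diag(gamma) / beta)
        B = Y.T @ A @ np.linalg.inv(A.T @ A + np.diag(gamma) / beta)
        # Precision grows for columns with little energy in either factor
        gamma = (m + n) / (np.sum(A**2, axis=0) + np.sum(B**2, axis=0) + 1e-12)
        keep = gamma < prune_thresh     # drop effectively dead columns
        A, B, gamma = A[:, keep], B[:, keep], gamma[keep]
    return A, B

# toy check: a noiseless rank-2 matrix
rng = np.random.default_rng(1)
Y = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
A, B = ard_lowrank(Y)
```

The appeal the email highlights follows from this structure: there is no rank parameter to tune, because the pruning threshold acts on precisions that diverge for unused columns regardless of the problem's scale.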
Bayesian Group-Sparse Modeling and Variational Inference by S. Derin Babacan, Shinichi Nakajima, Minh N. Do. The abstract reads:
In this paper, we present a general class of multivariate priors for group-sparse modeling within the Bayesian framework. We show that special cases of this class correspond to multivariate versions of several classical priors used for sparse modeling. Hence, this general prior formulation is helpful in analyzing the properties of different modeling approaches and their connections. We derive the estimation procedures with these priors using variational inference for fully Bayesian estimation. In addition, we discuss the differences between the proposed inference and deterministic inference approaches with these priors. Finally, we show the flexibility of this modeling by considering several extensions such as multiple measurements, within-group correlations and overlapping groups.
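As a small, hedged illustration of group-sparse Bayesian modeling (not the paper's released code, which covers a much broader family of priors), the sketch below fits a linear model in which all coefficients within a group share a single ARD precision, updated with EM-style fixed-point iterations. Because the precision is common to the whole group, entire groups are driven toward zero together, which is the defining behavior of group sparsity. The function name, the fixed noise variance, and the integer group labels are all illustrative assumptions.

```python
import numpy as np

def group_ard_regression(X, y, groups, n_iter=50, noise_var=0.01):
    """Sketch of group-sparse Bayesian regression: one ARD precision per
    group of coefficients, updated by an EM-style fixed point
    gamma_g = d_g / E[||w_g||^2]. Groups are integer labels 0..G-1."""
    groups = np.asarray(groups)
    n, d = X.shape
    G = groups.max() + 1
    gamma = np.ones(G)                   # one precision per group
    for _ in range(n_iter):
        # Gaussian posterior over weights given the current group precisions
        Sigma = np.linalg.inv(X.T @ X / noise_var + np.diag(gamma[groups]))
        w = Sigma @ X.T @ y / noise_var
        # Shared precision update: uses total expected energy of the group
        for g in range(G):
            m = groups == g
            gamma[g] = m.sum() / (w[m] @ w[m] + np.trace(Sigma[np.ix_(m, m)]))
    return w

# toy check: only the first group (coefficients 0..2) is truly active
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 9))
w_true = np.zeros(9)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.05 * rng.standard_normal(100)
w_hat = group_ard_regression(X, y, groups=[0, 0, 0, 1, 1, 1, 2, 2, 2])
```

Replacing the per-group precision with one precision per coefficient recovers ordinary (non-group) sparse Bayesian learning, which mirrors the paper's point that the classical sparse priors arise as special cases of the multivariate group formulation.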
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.