A low-complexity recursive procedure is presented for model selection and minimum mean squared error (MMSE) estimation in linear regression. Emphasis is given to the case of a sparse parameter vector and fewer observations than unknown parameters. A Gaussian mixture is chosen as the prior on the unknown parameter vector. The algorithm returns both a set of high posterior probability mixing parameters and an approximate MMSE estimate of the parameter vector. Exact ratios of posterior probabilities reveal potential ambiguity among multiple candidate solutions arising from observation noise or correlation among columns of the regressor matrix. Algorithm complexity is linear in the number of unknown coefficients, the number of observations, and the number of nonzero coefficients. If hyperparameters are unknown, a maximum likelihood estimate is found by a generalized expectation maximization algorithm. Numerical simulations demonstrate estimation performance and illustrate the distinctions between MMSE estimation and maximum a posteriori probability model selection.

According to the site, the Matlab implementation of the Fast Bayesian Matching Pursuit code used in the paper will be featured there shortly.
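To make the MMSE-vs-model-selection distinction in the abstract concrete, here is a small brute-force sketch (not the paper's fast recursive algorithm): it assumes a Bernoulli-Gaussian prior on the sparse vector, enumerates small supports, weights each support by its posterior probability, and averages the per-support conditional means into an MMSE estimate. All parameter values and names are illustrative, not taken from the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 5                  # unknowns, observations (m < n)
p, sx2, sn2 = 0.15, 1.0, 0.01  # activity prob., signal and noise variances

A = rng.standard_normal((m, n)) / np.sqrt(m)   # regressor matrix
x_true = np.zeros(n)
x_true[[1, 5]] = rng.standard_normal(2)        # sparse ground truth
y = A @ x_true + np.sqrt(sn2) * rng.standard_normal(m)

def log_evidence(S):
    # Under support S: y ~ N(0, sx2 * A_S A_S^T + sn2 * I)
    C = sn2 * np.eye(m)
    if S:
        As = A[:, S]
        C = C + sx2 * As @ As.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y))

# Enumerate all supports up to size 3 (feasible only for toy n).
supports = [list(S) for k in range(4)
            for S in itertools.combinations(range(n), k)]
logw = np.array([log_evidence(S)
                 + len(S) * np.log(p) + (n - len(S)) * np.log(1 - p)
                 for S in supports])
w = np.exp(logw - logw.max())
w /= w.sum()                                   # posterior support weights

# MMSE estimate: posterior-weighted average of conditional means.
x_mmse = np.zeros(n)
for wi, S in zip(w, supports):
    if not S:
        continue
    As = A[:, S]
    C = sx2 * As @ As.T + sn2 * np.eye(m)
    x_mmse[S] += wi * sx2 * (As.T @ np.linalg.solve(C, y))

# MAP model selection, by contrast, keeps only the single best support.
best_support = supports[int(np.argmax(w))]
```

The contrast the abstract draws falls out directly: `x_mmse` blends all plausible supports (hedging against the ambiguity the posterior ratios expose), while `best_support` commits to one model.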
Also, Simon Foucart tells me that he and Ming-Jun Lai have updated their paper entitled Sparsest solutions of underdetermined linear systems via lq-minimization for 0 &lt; q ≤ 1. As mentioned before, the code is here.
Since the beginning of this week, we have had several additions on the reconstruction side of things, and I have updated the list of reconstruction codes in the big picture accordingly.
Finally, here is Ike's prediction as of yesterday and how it relates to both Texas A&M University and Rice University. Better information can be found here.
Credit: NASA/JPL/University of Arizona, HiRISE camera image of Mars, Ius Chasma's Floor (PSP_009368_1720)