Thursday, November 05, 2015

Sparse Multinomial Logistic Regression via Approximate Message Passing

Here is Evan Byrne's Master's thesis, entitled "Sparse Multinomial Logistic Regression via Approximate Message Passing," and the attendant arXiv preprint and recent poster. Congratulations, Evan!



Sparse Multinomial Logistic Regression via Approximate Message Passing by Evan Byrne, Philip Schniter

For the problem of multi-class linear classification and feature selection, we propose approximate message passing approaches to sparse multinomial logistic regression. First, we propose two algorithms based on the Hybrid Generalized Approximate Message Passing (HyGAMP) framework: one finds the maximum a posteriori (MAP) linear classifier and the other finds an approximation of the test-error-rate minimizing linear classifier. Then we design computationally simplified variants of these two algorithms. Next, we detail methods to tune the hyperparameters of their assumed statistical models using Stein's unbiased risk estimate (SURE) and expectation-maximization (EM), respectively. Finally, using both synthetic and real-world datasets, we demonstrate improved error-rate and runtime performance relative to state-of-the-art existing approaches.
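The HyGAMP algorithms from the thesis are not reproduced here, but the baseline problem they target, MAP estimation of a sparse multinomial linear classifier under a sparsity-promoting prior, can be sketched with an off-the-shelf L1-penalized fit (equivalent to MAP under a Laplace prior). This is a minimal illustration, not the paper's method; the data dimensions and regularization strength below are arbitrary choices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic multi-class data whose ground-truth weights are sparse:
# only the first s of d features carry class information.
rng = np.random.default_rng(0)
n, d, k, s = 300, 50, 3, 5  # samples, features, classes, active features
W = np.zeros((d, k))
W[:s, :] = rng.normal(size=(s, k))
X = rng.normal(size=(n, d))
y = np.argmax(X @ W + 0.1 * rng.normal(size=(n, k)), axis=1)

# L1-penalized multinomial logistic regression: the MAP linear
# classifier under a Laplace (sparsity-promoting) weight prior.
clf = LogisticRegression(penalty="l1", solver="saga", C=0.1, max_iter=5000)
clf.fit(X, y)

# The L1 penalty drives most coefficients exactly to zero,
# performing feature selection jointly with classification.
sparsity = np.mean(clf.coef_ == 0)
print(f"fraction of zero coefficients: {sparsity:.2f}")
print(f"training accuracy: {clf.score(X, y):.2f}")
```

The regularization strength `C` plays the role of the hyperparameters that the paper proposes to tune automatically via SURE or EM, rather than by cross-validation.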
 
