If you have read these two Sunday Morning Insights:

- Sunday Morning Insight: Randomization is not a dirty word
- Sunday Morning Insight: Sharp Phase Transitions in Machine Learning?

then you won't be surprised by the finding that phase transitions are indeed coming to Machine Learning. If you are, here is another stone to add to that edifice:
Binary Linear Classification and Feature Selection via Generalized Approximate Message Passing by Justin Ziniel and Philip Schniter
For the problem of binary linear classification and feature selection, we propose algorithmic approaches to classifier design based on the generalized approximate message passing (GAMP) algorithm, recently proposed in the context of compressive sensing. Our work focuses on the regime where the number of features greatly exceeds the number of training examples, but where only a few features suffice for accurate classification. We show that sum-product GAMP can be used to (approximately) minimize the classification error rate and max-sum GAMP can be used to minimize a wide variety of regularized loss functions. Moreover, we show how a "turbo" extension to GAMP allows us to learn weight vectors that exhibit structured sparsity. Furthermore, we describe an expectation-maximization (EM)-based scheme to learn the associated model parameters online, as an alternative to cross-validation, and we show that GAMP's state evolution framework can be used to accurately predict the misclassification rate. Finally, we present a detailed numerical study to confirm the accuracy, speed, and flexibility afforded by our GAMP-based approaches to binary linear classification.
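To make the mechanics concrete, here is a small numpy sketch of one way a sum-product GAMP loop for this problem can look. This is not the authors' code: the Bernoulli-Gaussian prior, the probit output channel, the function name `gamp_probit`, and every parameter value below are illustrative assumptions, and the paper's "turbo" structured-sparsity extension, EM parameter learning, and state evolution analysis are not included.

```python
# Toy sum-product GAMP sketch for sparse binary linear classification.
# Assumptions (not from the paper's code): Bernoulli-Gaussian weight prior,
# probit output channel p(y=1|z) = Phi(z / sqrt(var_noise)).
import numpy as np
from scipy.special import expit
from scipy.stats import norm

def gamp_probit(A, y, sparsity=0.05, var_x=1.0, var_noise=1e-2, n_iter=50):
    """Estimate a sparse weight vector x from labels y = sign(A x + noise).

    A : (m, n) feature matrix; y : (m,) labels in {-1, +1}.
    Prior on each weight: (1 - sparsity)*delta(0) + sparsity*N(0, var_x).
    """
    m, n = A.shape
    A2 = A ** 2
    x_hat = np.zeros(n)                      # posterior means of the weights
    tau_x = sparsity * var_x * np.ones(n)    # posterior variances
    s_hat = np.zeros(m)

    for _ in range(n_iter):
        # --- output (measurement) side ---
        tau_p = A2 @ tau_x
        p_hat = A @ x_hat - tau_p * s_hat    # Onsager-corrected estimate of z = A x
        # Moments of z ~ N(p_hat, tau_p) conditioned on the probit label y
        sigma = np.sqrt(tau_p + var_noise)
        c = y * p_hat / sigma
        ratio = norm.pdf(c) / np.maximum(norm.cdf(c), 1e-12)
        z_mean = p_hat + y * (tau_p / sigma) * ratio
        z_var = tau_p - (tau_p ** 2 / sigma ** 2) * ratio * (c + ratio)
        s_hat = (z_mean - p_hat) / tau_p
        tau_s = np.maximum((1.0 - z_var / tau_p) / tau_p, 1e-12)

        # --- input (weight) side ---
        tau_r = 1.0 / (A2.T @ tau_s)
        r_hat = x_hat + tau_r * (A.T @ s_hat)
        # Bernoulli-Gaussian denoiser applied to r_hat ~ N(x, tau_r)
        log_odds = (np.log(sparsity / (1 - sparsity))
                    + norm.logpdf(r_hat, 0.0, np.sqrt(tau_r + var_x))
                    - norm.logpdf(r_hat, 0.0, np.sqrt(tau_r)))
        pi = expit(log_odds)                       # P(weight is active)
        gamma = r_hat * var_x / (var_x + tau_r)    # posterior mean if active
        nu = var_x * tau_r / (var_x + tau_r)       # posterior variance if active
        x_hat = pi * gamma
        tau_x = np.maximum(pi * (nu + gamma ** 2) - x_hat ** 2, 1e-12)
    return x_hat

if __name__ == "__main__":
    # Synthetic demo in the regime the paper targets: many more features
    # than training examples, only a few features relevant.
    rng = np.random.default_rng(0)
    m, n, k = 200, 1000, 10
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
    A = rng.normal(size=(m, n))
    y = np.sign(A @ x_true + 0.1 * rng.normal(size=m))
    x_est = gamp_probit(A, y, sparsity=k / n)
    print("training error rate:", np.mean(np.sign(A @ x_est) != y))
```

The two-sided structure is the point: each iteration alternates a scalar output step on the m measurements and a scalar input (denoising) step on the n weights, so the per-iteration cost is dominated by multiplications with A, which is what makes the approach fast in the wide regime above.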