Wednesday, April 02, 2014

KNIFE: Automatic Feature Selection via Weighted Kernels and Regularization - implementation -

One wonders whether some part of this algorithm could be randomized:




Selecting important features in non-linear kernel spaces is a difficult challenge in both classification and regression problems. We propose to achieve feature selection by optimizing a simple criterion: a feature-regularized loss function. Features within the kernel are weighted, and a lasso penalty is placed on these weights to encourage sparsity. We minimize this feature-regularized loss function by estimating the weights in conjunction with the coefficients of the original classification or regression problem, thereby automatically procuring a subset of important features. Our algorithm, KerNel Iterative Feature Extraction (KNIFE), is applicable to a wide variety of kernels and high-dimensional kernel problems. In addition, a modification of KNIFE gives a computationally attractive method for graphically depicting non-linear relationships between features by estimating their feature weights over a range of regularization parameters. We demonstrate the utility of KNIFE in selecting features through simulations and examples for both kernel regression and support vector machines. Feature path realizations also give graphical representations of important features and the nonlinear relationships among variables.
The attendant implementation is here.
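
To make the idea in the abstract concrete, here is a minimal sketch of feature weighting inside a Gaussian kernel with an l1 penalty on the weights, alternating between a closed-form kernel ridge regression solve and a projected proximal-gradient step on the weights. This is an illustration under assumed choices, not the paper's exact formulation or the linked implementation's API; the kernel form, the ridge loss, and the names (`knife_krr`, `lam_ridge`, `lam_l1`, `step`) are all assumptions.

```python
import numpy as np

def weighted_gaussian_kernel(X1, X2, w):
    """K(x, z) = exp(-sum_j w_j (x_j - z_j)^2) with nonnegative feature weights w."""
    # per-feature squared differences, weighted and summed over features
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2 * w).sum(axis=2)
    return np.exp(-d2)

def knife_krr(X, y, lam_ridge=1e-2, lam_l1=1e-1, step=1e-2, n_iter=100):
    """Hypothetical KNIFE-style alternating scheme for kernel ridge regression."""
    n, p = X.shape
    w = np.full(p, 1.0 / p)                       # start with uniform feature weights
    alpha = np.zeros(n)
    for _ in range(n_iter):
        # 1) with the weights fixed, solve kernel ridge regression in closed form
        K = weighted_gaussian_kernel(X, X, w)
        alpha = np.linalg.solve(K + lam_ridge * np.eye(n), y)
        # 2) with the coefficients fixed, take a gradient step on the squared loss
        #    in w, then apply the l1 prox (soft-threshold) and project onto w >= 0
        resid = K @ alpha - y
        grad = np.zeros(p)
        for j in range(p):
            D2j = (X[:, None, j] - X[None, :, j]) ** 2
            dK = -D2j * K                          # derivative of K w.r.t. w_j
            grad[j] = 2.0 * resid @ (dK @ alpha)
        w = w - step * grad
        w = np.maximum(w - step * lam_l1, 0.0)     # sparsity-inducing shrinkage
    return w, alpha
```

On a toy regression where only the first two features matter, the weights on the irrelevant features should shrink toward zero as the lasso penalty is increased:

```python
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=80)
w, alpha = knife_krr(X, y)
print(np.round(w, 3))
```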



