
Thursday, October 06, 2016

The Famine of Forte: Few Search Problems Greatly Favor Your Algorithm

From the conclusion of the paper:
The search framework we propose is general enough to be applied to many problem areas, such as machine learning, evolutionary search, security and hyperparameter optimization. The results are not just of theoretical importance, but help explain real-world phenomena, such as the need for exploitable dependence in machine learning and the empirical difficulty of automated learning [12]. Our results help us understand the growing popularity of deep learning methods and the unavoidable interest in automated hyperparameter tuning methods. Extending the framework to continuous settings and other problem areas (such as active learning and regression) is the focus of ongoing research.
 
 
 
The Famine of Forte: Few Search Problems Greatly Favor Your Algorithm by George D. Montanez

No Free Lunch theorems show that the average performance across any closed-under-permutation set of problems is fixed for all algorithms, under appropriate conditions. Extending these results, we demonstrate that the proportion of favorable problems is itself strictly bounded, such that no single algorithm can perform well over a large fraction of possible problems. Our results explain why we must either continue to develop new learning methods year after year or move towards highly parameterized models that are both flexible and sensitive to their hyperparameters.
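As a rough illustration of the No Free Lunch claim in the abstract (this sketch is not from the paper; the search space, the target size k, the query budget q and the two query orders are all illustrative assumptions), the following Python snippet enumerates a permutation-closed set of search problems, namely all targets of a fixed size in a small space, and checks that two different fixed query orders achieve exactly the same average success rate within the query budget.

# Minimal sketch (not from the paper): the average performance over a
# permutation-closed set of problems is identical for any two fixed,
# non-repeating search strategies.
from itertools import combinations

n, k, q = 8, 3, 4                    # space size, target size, query budget (illustrative)

space = list(range(n))               # search space: the integers 0..n-1
alg_a = space                        # strategy A: query in ascending order
alg_b = space[::-1]                  # strategy B: query in descending order

def success(order, target):
    # 1 if the first q queries of `order` hit the target set, else 0
    return int(any(x in target for x in order[:q]))

# all size-k targets: a set of problems closed under permutation of the space
problems = [set(t) for t in combinations(space, k)]

avg_a = sum(success(alg_a, t) for t in problems) / len(problems)
avg_b = sum(success(alg_b, t) for t in problems) / len(problems)

print(avg_a, avg_b)                  # identical averages for both strategies

Both averages equal 1 - C(n-q, k)/C(n, k) regardless of the query order, which is the fixed-average behavior the No Free Lunch theorems describe; the paper's contribution is the further bound on how many individual problems can favor any single algorithm.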
 
 
 
 
 
 
