If there is one thing that changed the course of signal processing, it happened, in my view, back in 1996 [1] with the realization that the mammalian primary visual cortex could perform sparse coding on natural scene images. In other words, much of the intellectual work since then has tried to stick to this element of sparsity and compressibility as a way to be as close as possible to what Nature does. Compressive Sensing is one such outcome. But deep down, given new signals, we never really know whether sparsity is merely important or very important. There is also the issue of structured sparsity, which is much less well defined. In all, one wonders if there is a way to pick datasets, figure out their deeper structure, and then, maybe as a consequence, build new sensing frameworks around those structures.
If you recall some of the ideas expressed in this Sunday Morning Insight entry on Matrix Factorizations and the Grammar of Life, you will like the next items, as they pertain to this path of extracting structure from datasets. Some of the authors featured in that Sunday Morning Insight entry have come up with a new paper entitled: Structure Discovery in Nonparametric Regression through Compositional Kernel Search by David Duvenaud, James Robert Lloyd, Roger Grosse, Joshua B. Tenenbaum, Zoubin Ghahramani. The abstract reads:
Despite its importance, choosing the structural form of the kernel in nonparametric regression remains a black art. We define a space of kernel structures which are built compositionally by adding and multiplying a small number of base kernels. We present a method for searching over this space of structures which mirrors the scientific discovery process. The learned structures can often decompose functions into interpretable components and enable long-range extrapolation on time-series datasets. Our structure search method outperforms many widely used kernels and kernel combination methods on a variety of prediction tasks.
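The compositional search the abstract describes is easy to sketch: start from a handful of base kernels, and at each step expand the current best structure by adding or multiplying in a base kernel, keeping whichever composite scores best. Below is a minimal, hypothetical Python sketch of that idea using scikit-learn's kernel arithmetic; the names (BASE_KERNELS, score, expand, greedy_search) are mine, not the authors', and the paper scores candidates with a BIC-style criterion rather than the raw log marginal likelihood used here. For the real thing, see the authors' implementation linked below.

```python
# Illustrative sketch of greedy compositional kernel search, not the
# authors' implementation. Assumes scikit-learn for GP regression.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    RBF, ExpSineSquared, RationalQuadratic, DotProduct)

# A small set of base kernels (squared-exponential, periodic,
# rational quadratic, linear), as in the paper's grammar.
BASE_KERNELS = [RBF(), ExpSineSquared(), RationalQuadratic(), DotProduct()]

def score(kernel, X, y):
    """Fit a GP (optimizing hyperparameters) and return its log marginal
    likelihood. The paper penalizes this BIC-style; we keep it simple."""
    gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-2, normalize_y=True)
    gp.fit(X, y)
    return gp.log_marginal_likelihood_value_

def expand(kernel):
    """Grammar step: combine the current structure with every base kernel
    by addition and by multiplication."""
    for base in BASE_KERNELS:
        yield kernel + base
        yield kernel * base

def greedy_search(X, y, depth=3):
    """Greedily grow a composite kernel, keeping the best expansion."""
    best = max(BASE_KERNELS, key=lambda k: score(k, X, y))
    best_score = score(best, X, y)
    for _ in range(depth - 1):
        scored = [(score(k, X, y), k) for k in expand(best)]
        top_score, top = max(scored, key=lambda t: t[0])
        if top_score <= best_score:
            break  # no candidate improves on the current structure
        best, best_score = top, top_score
    return best

# Toy usage: a periodic signal with a linear drift, where one would hope
# the search settles on something like Periodic + Linear.
X = np.linspace(0, 10, 60).reshape(-1, 1)
y = np.sin(2 * X).ravel() + 0.3 * X.ravel() + 0.05 * np.random.randn(60)
print(greedy_search(X, y, depth=2))
```

Note that each scoring step refits the GP from scratch, so this toy search is slow; the point is only to make the add/multiply grammar and the greedy loop concrete.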
You may also like an attendant poster and presentation, Automated Structure Discovery in Nonparametric Regression through Compositional Grammars, and a related webpage, The Kernel Cookbook: Advice on Covariance Functions, all by David Duvenaud. The most important part of this paper and its attendant documents is the availability of an implementation on Github at:
Also from David Duvenaud's series of talks:
Introduction to Probabilistic Programming and Automated Inference, Computational and Biological Learning Lab, University of Cambridge, March 2013
Meta-reasoning and Bounded Rationality, Tea talk, Feb 2013
Of related interest for background: Gaussian Processes for Machine Learning by Carl Edward Rasmussen and Christopher K. I. Williams
[1] Olshausen BA, Field DJ (1996). Emergence of Simple-Cell Receptive Field Properties by Learning a Sparse Code for Natural Images. Nature, 381: 607-609.
1 comment:
This is very exciting research, as is the related paper "Exploiting compositionality to explore a large space of model structures".
Of interest is that this paper notes the relationship between itself and the related paper by saying: "Grosse et al. (2012) performed a greedy search over a compositional model class for unsupervised learning, using a grammar and a search procedure which parallel our own. This model class contained a large number of existing unsupervised models as special cases and was able to discover such structure automatically from data. Our work is tackling a similar problem, but in a supervised setting."