Friday, January 18, 2013

Fast Food: Approximating Kernel Expansion in Loglinear Time

You probably recall Fast Functions via Randomized Algorithms: Fastfood versus Random Kitchen Sinks. Well, Ana at VideoLectures.net just informed me that the video of Alex Smola's presentation is now available from the NIPS video site. We may have to wait for the presentation itself (some of it is here), though. In the meantime, here is the abstract:

The ability to evaluate nonlinear function classes rapidly is crucial for nonparametric estimation. We propose an improvement to random kitchen sinks that offers O(n log d) computation and O(n) storage for n basis functions in d dimensions without sacrificing accuracy. We show how one may adjust the regularization properties of the kernel simply by changing the spectral distribution of the projection matrix. Experiments show that we achieve identical accuracy to full kernel expansions and random kitchen sinks while being 100x faster and using 1000x less memory.
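To make the idea a little more concrete, here is a minimal sketch in Python/NumPy of a Fastfood-style feature map as I understand it from the abstract and the earlier talk: the dense Gaussian projection of random kitchen sinks is replaced by a product of Hadamard transforms and random diagonal matrices. The function name fastfood_features and the exact scaling constants are my own choices for illustration, not code from the authors, and the dense hadamard() call stands in for the fast Walsh-Hadamard transform that gives the log d factor.

import numpy as np
from scipy.linalg import hadamard

def fastfood_features(X, n_features, sigma=1.0, seed=0):
    """Approximate Gaussian RBF random features with a Fastfood-style
    chain of Hadamard transforms and random diagonal matrices instead
    of a dense Gaussian projection (illustrative sketch only).

    X: (n_samples, d) data; d is zero-padded to the next power of two.
    Returns an (n_samples, 2 * n_features) feature matrix whose inner
    products approximate exp(-||x - y||^2 / (2 sigma^2)).
    """
    rng = np.random.default_rng(seed)
    n_samples, d = X.shape
    # Pad the dimension to a power of two so the Hadamard transform applies.
    d_pad = 1 << (d - 1).bit_length()
    X_pad = np.zeros((n_samples, d_pad))
    X_pad[:, :d] = X

    n_blocks = int(np.ceil(n_features / d_pad))
    # Dense normalized Hadamard matrix; a fast Walsh-Hadamard transform
    # would bring the per-sample cost down to O(d log d).
    H = hadamard(d_pad) / np.sqrt(d_pad)

    blocks = []
    for _ in range(n_blocks):
        B = rng.choice([-1.0, 1.0], size=d_pad)   # random sign diagonal
        G = rng.standard_normal(d_pad)            # Gaussian scaling diagonal
        P = rng.permutation(d_pad)                # random permutation
        # Rescale so row norms behave like those of a Gaussian matrix
        # (chi-distributed); constants here are my reading, not the paper's code.
        S = np.sqrt(rng.chisquare(d_pad, size=d_pad)) / np.linalg.norm(G)
        V = (X_pad * B) @ H        # sign flip, then Hadamard mix
        V = V[:, P] * G            # permute and apply Gaussian scaling
        V = V @ H                  # second Hadamard mix
        blocks.append(V * S / sigma)

    V = np.hstack(blocks)[:, :n_features]
    # Random-kitchen-sinks-style trigonometric features.
    return np.hstack([np.cos(V), np.sin(V)]) / np.sqrt(n_features)

Under these assumptions, Phi = fastfood_features(X, n_features=1024, sigma=2.0) gives features whose inner products should approximate the Gaussian RBF kernel, with only diagonal matrices and a permutation stored per block, which is where the O(n) storage claim in the abstract comes from.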

2 comments:

vkamath said...

What paper is this talk based on? I don't seem to be able to find much about it.

Igor said...

If you follow the link to the previous entry (http://nuit-blanche.blogspot.com/2012/11/fast-functions-via-randomized.html), you'll notice "Fast Food...2013, submitted". As far as I can tell there is no preprint yet. If you do find something, please let me know and I will update accordingly.

Igor.
