Tuesday, July 09, 2013

The Summer of the Deeper Kernels

In the ROKS2013 Extended abstracts list and the SAHD Program Agenda, there are several presentations that mention kernels as a means of learning or performing nonlinear compressive sensing. At SPARS, we have Kernel Compressive Sensing by Farhad Pourkamali Anaraki. I also note that, at ROKS, the deeper learning trend is coming to kernels as well. See, for instance, the abstract and accompanying note quoted further below.
A small note on yesterday's note: let me just restate what I pointed out back in 2007 (Compressed Sensing in the Primary Visual Cortex?). One can read, specifically in the supplemental material to A feedforward architecture accounts for rapid categorization by Thomas Serre, Aude Oliva, and Tomaso Poggio, that:

"..simple units receive only a subset of the possible afferent units (selected at random) such that nSk < NSk × NSk..."

Here is the abstract I had in mind:

We establish a link between Fourier optics and a recent construction from the machine learning community termed the kernel mean map. Using the Fraunhofer approximation, it identifies the kernel with the squared Fourier transform of the aperture. This allows us to use results about the invertibility of the kernel mean map to provide a statement about the invertibility of Fraunhofer diffraction, showing that imaging processes with arbitrarily small apertures can in principle be invertible, i.e., do not lose information, provided the objects to be imaged satisfy a generic condition. A real world experiment shows that we can super-resolve beyond the Rayleigh limit.
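To make the identification concrete, here is a small numerical sketch (mine, not the authors'; the 1-D slit aperture and all variable names are assumptions for illustration). The kernel is taken as the squared modulus of the Fourier transform of the aperture, and the kernel mean map of a point set is the average of that kernel centred at each point:

import numpy as np

# 1-D aperture A(u): an open slit of half-width a, sampled on a grid u
u = np.linspace(-10.0, 10.0, 4001)
a = 0.5
aperture = (np.abs(u) <= a).astype(float)

# Fraunhofer picture, as in the abstract: the kernel is the squared
# Fourier transform of the aperture, evaluated at the displacement x - y.
def kernel(x, y):
    phase = np.exp(-2j * np.pi * u * (x - y))
    ft = np.trapz(aperture * phase, u)   # Fourier transform of the aperture at x - y
    return np.abs(ft) ** 2

# Kernel mean map of an empirical distribution {x_1, ..., x_n}:
# mu(t) = (1/n) sum_i k(x_i, t); its invertibility is what the abstract
# uses to argue that small apertures need not lose information.
points = np.array([-0.2, 0.0, 0.35])
def mean_map(t):
    return np.mean([kernel(xi, t) for xi in points])

print(kernel(0.0, 0.0), mean_map(0.1))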
From note 5:
5. This provides a physical interpretation of the kernel as the point response of an optical system. This kind of interpretation can be beneficial also for other systems, and indeed it is suggested by the view of kernels as Green's functions [16, 24]: the kernel k can be viewed as the Green's function of P*P, where P is a regularization operator such that the RKHS norm can be written as ‖f‖_k = ‖Pf‖. For instance, the Gaussian kernel corresponds to a regularization operator which computes an infinite series of derivatives of f.
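In equations, and hedging on conventions (this is my transcription of the standard regularization view, in the spirit of Girosi, Jones and Poggio, not a quote from the paper):

\[
\|f\|_k = \|Pf\|, \qquad \big(P^{*}P\, k\big)(x,\cdot) = \delta_x ,
\]
and for the Gaussian kernel $k(x,y)=\exp\!\big(-\|x-y\|^{2}/(2\sigma^{2})\big)$ one common form of the corresponding operator is
\[
\|Pf\|^{2} \;=\; \sum_{n=0}^{\infty} \frac{\sigma^{2n}}{n!\,2^{n}} \int \big| O^{n} f(x)\big|^{2}\, dx ,
\qquad O^{2n}=\Delta^{n},\quad O^{2n+1}=\nabla\Delta^{n},
\]
i.e., P does indeed compute an infinite series of derivatives of f.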

Video: Kernel Tricks, Means and Ends by Bernhard Schölkopf


I can't wait to see how Random Kitchen Sinks could fit into these deeper architectures and how they could map to physical systems.
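For readers who have not met them, Random Kitchen Sinks are the random Fourier features of Rahimi and Recht: a random feature map whose inner products approximate a Gaussian kernel. A minimal sketch (my own illustration, not tied to any of the talks above):

import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features=2000, sigma=1.0):
    """Random Kitchen Sinks map z(x) with z(x).z(y) ~ exp(-||x-y||^2 / (2 sigma^2))."""
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))   # random frequencies ~ N(0, sigma^-2)
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)        # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Sanity check: feature inner products approximate the Gaussian kernel,
# and the approximation improves as n_features grows.
X = rng.standard_normal((5, 3))
Z = random_fourier_features(X)
approx = Z @ Z.T
exact = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / 2.0)
print(np.abs(approx - exact).max())

Stacking such random maps, or composing them with physically realized kernels like the optical one above, is one obvious way they could enter deeper architectures.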


Join the CompressiveSensing subreddit or the Google+ Community and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
