Interesting exchange on Twitter between two members of the Paris Machine Learning Meetup group:
@syhw Awesome! Your idea “DL is kernel machines whose kernel we learn” is shared by others as well http://t.co/Y4Vsh4S3wx
— Andrei Bursuc (@abursuc) August 11, 2014
First, there is Gabriel Synnaeve's blog post on deep learning. It is at:
Second, there is this paper that Andrei Bursuc mentioned and that I should have mentioned earlier (the implementation will follow once it is published). The method approximates the Gaussian kernel in a way that looks (to me at least) akin to the Fast Multipole Method. Then again, instead of learning W, one might even look into making those weights random, as is done in the Random Kitchen Sinks approach. In the meantime, let's enjoy this one:
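As a reminder of what the Random Kitchen Sinks approach does, here is a minimal sketch (my own toy example, not code from the paper) of random Fourier features in the style of Rahimi and Recht: sample random frequencies from the Fourier transform of the Gaussian kernel, and dot products of the resulting features approximate the kernel itself.

```python
import numpy as np

def random_fourier_features(X, D=5000, gamma=0.5, seed=0):
    """Random features z(x) such that z(x).z(y) ~ exp(-gamma*||x-y||^2),
    i.e. the Gaussian (RBF) kernel, following the Random Kitchen Sinks idea."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # For this kernel, frequencies are drawn from N(0, 2*gamma*I)
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
    b = rng.uniform(0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = np.random.default_rng(1).normal(size=(5, 3))
Z = random_fourier_features(X)
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
print(np.abs(K_approx - K_exact).max())  # error shrinks as D grows
```

No learning happens here: W is purely random, which is precisely the contrast with approaches that learn W.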
Convolutional Kernel Networks by Julien Mairal, Piotr Koniusz, Zaid Harchaoui, Cordelia Schmid
An important goal in visual recognition is to devise image representations that are invariant to particular transformations. In this paper, we address this goal with a new type of convolutional neural network (CNN) whose invariance is encoded by a reproducing kernel. Unlike traditional approaches where neural networks are learned either to represent data or for solving a classification task, our network learns to approximate the kernel feature map on training data. Such an approach enjoys several benefits over classical ones. First, by teaching CNNs to be invariant, we obtain simple network architectures that achieve a similar accuracy to more complex ones, while being easy to train and robust to overfitting. Second, we bridge a gap between the neural network literature and kernels, which are natural tools to model invariance. We evaluate our methodology on visual recognition tasks where CNNs have proven to perform well, e.g., digit recognition with the MNIST dataset, and the more challenging CIFAR-10 and STL-10 datasets, where our accuracy is competitive with the state of the art.
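To make the abstract's central idea concrete — a network that learns to approximate a kernel feature map on training data rather than a classification objective — here is a heavily simplified toy sketch of my own (not the authors' algorithm): fit the weights W of cosine features so that their dot products match a target Gaussian kernel matrix, by plain gradient descent on the squared discrepancy.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, D = 20, 3, 50
X = rng.normal(size=(n, d))
gamma = 0.5
# Target: the exact Gaussian kernel matrix on the training data
K = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))

# Features z(x) = sqrt(2/D) * cos(xW + b), initialized at random
W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
b = rng.uniform(0, 2 * np.pi, size=D)
c = np.sqrt(2.0 / D)

def loss_and_grads(W, b):
    A = X @ W + b
    Z = c * np.cos(A)
    G = Z @ Z.T - K              # residual against the true kernel
    loss = (G ** 2).sum()
    dZ = 4 * G @ Z               # d loss / d Z (G is symmetric)
    dA = -c * np.sin(A) * dZ     # chain rule through the cosine
    return loss, X.T @ dA, dA.sum(0)

lr = 1e-3
loss0, _, _ = loss_and_grads(W, b)
for _ in range(500):
    loss, dW, db = loss_and_grads(W, b)
    W -= lr * dW
    b -= lr * db
print(loss0, "->", loss)  # the learned features match the kernel better
```

The paper does this with a convolutional architecture and a reproducing kernel encoding invariance; the sketch only illustrates the training signal, namely "approximate the kernel feature map" instead of "minimize classification error."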
Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.