Interesting! Sketching kernels:
Randomized sketches for kernels: Fast and optimal non-parametric regression by Yun Yang, Mert Pilanci, Martin J. Wainwright
Kernel ridge regression (KRR) is a standard method for performing non-parametric regression over reproducing kernel Hilbert spaces. Given n samples, the time and space complexity of computing the KRR estimate scale as O(n^3) and O(n^2) respectively, and so are prohibitive in many cases. We propose approximations of KRR based on m-dimensional randomized sketches of the kernel matrix, and study how small the projection dimension m can be chosen while still preserving minimax optimality of the approximate KRR estimate. For various classes of randomized sketches, including those based on Gaussian and randomized Hadamard matrices, we prove that it suffices to choose the sketch dimension m proportional to the statistical dimension (modulo logarithmic factors). Thus, we obtain fast and minimax optimal approximations to the KRR estimate for non-parametric regression.
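For concreteness, here is a minimal numpy sketch of the idea under assumed choices (a Gaussian sketch matrix, an RBF kernel, one particular scaling of the ridge objective, and illustrative variable names); it illustrates sketched KRR in general and is not the authors' exact formulation or code.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Pairwise squared distances, then the Gaussian/RBF kernel (assumed kernel choice).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_exact(K, y, lam):
    # Exact KRR weights (K + n*lam*I)^{-1} y: O(n^3) time, O(n^2) space.
    n = K.shape[0]
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krr_sketched(K, y, lam, m, rng):
    # Gaussian sketch S (m x n); restrict the weights to w = S^T a and solve
    # the m-dimensional problem
    #   min_a (1/(2n)) ||y - K S^T a||^2 + (lam/2) a^T S K S^T a,
    # whose normal equations are (S K^2 S^T / n + lam S K S^T) a = S K y / n.
    n = K.shape[0]
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    KSt = K @ S.T                                 # n x m
    A = (KSt.T @ KSt) / n + lam * (S @ KSt)       # m x m system
    b = KSt.T @ y / n
    a = np.linalg.solve(A, b)
    return S.T @ a                                # weights in the original span

# Toy comparison on synthetic data (assumed setup, not from the paper).
rng = np.random.default_rng(0)
n, m, lam = 500, 60, 1e-3
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(n)
K = rbf_kernel(X, X, gamma=5.0)

w_exact = krr_exact(K, y, lam)
w_sketch = krr_sketched(K, y, lam, m, rng)
print("max fitted-value gap:", np.abs(K @ w_exact - K @ w_sketch).max())
```

The point of the construction is that the dominant cost moves from solving an n x n system to solving an m x m one, and the paper's result says m of the order of the statistical dimension (up to logarithmic factors) already preserves minimax optimality.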
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.