If you recall Random Kitchen Sinks, you'll remember that one chooses some of the coefficients at random and then estimates the remaining coefficients with a least-squares approach. The functionals built from these random numbers are part of what is called a (nonlinear) dictionary. Here is the state of the art on approximating time series in RKHSs with nonrandom "frequencies". Maybe one should look into random dictionaries:
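For concreteness, here is a minimal sketch of the Random Kitchen Sinks idea: random Fourier features for a Gaussian kernel, with only the outer weights fit by ordinary least squares. The toy signal and all parameter values are illustrative assumptions, not taken from any of the papers below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D time series to approximate
x = np.linspace(0, 1, 200)[:, None]
y = np.sin(6 * np.pi * x[:, 0]) + 0.1 * rng.standard_normal(200)

# Random features z(x) = cos(x W + b), with W ~ N(0, 1/sigma^2), b ~ U[0, 2*pi]:
# these are the randomly chosen "frequencies" of the dictionary
D, sigma = 100, 0.1
W = rng.standard_normal((1, D)) / sigma
b = rng.uniform(0, 2 * np.pi, D)
Z = np.cos(x @ W + b)

# The remaining coefficients come from an ordinary least-squares fit
alpha, *_ = np.linalg.lstsq(Z, y, rcond=None)
y_hat = Z @ alpha
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

The randomness does all the work of choosing basis functions; only the linear weights `alpha` are optimized.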

Kernel LMS algorithm with forward-backward splitting for dictionary learning, by Gao, Wei, Chen, Jie, Richard, Cédric, Huang, Jianguo, and Flamary, Rémi. A related tutorial can be found here.

Nonlinear adaptive filtering with kernels has become a topic of high interest over the last decade. A characteristic of kernel-based techniques […] filter parameter update stage. It is surprising to note that most existing strategies for dictionary update can only incorporate new elements into the dictionary. This unfortunately means that they cannot discard obsolete kernel functions, within the context of a time-varying environment in particular. Recently, to remedy this drawback, it has been proposed to associate an ℓ1-norm regularization criterion with the mean-square error criterion. The aim of this paper is to provide theoretical results on the convergence of this approach.

A Matlab implementation is available on Cédric's page.
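The ℓ1-norm regularization mentioned in the abstract is what lets obsolete dictionary elements be discarded: a forward (gradient) step on the mean-square error is followed by a backward (proximal) step that soft-thresholds the weights, driving some of them exactly to zero. Here is a rough sketch of that forward-backward splitting idea on a synthetic problem; it is not the authors' algorithm, and the data, step size, and regularization level are all made up for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks entries toward zero
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(1)

# Synthetic regression: K holds evaluations against a dictionary of
# 20 elements, but only 3 of them actually contribute to the signal
K = rng.standard_normal((500, 20))
w_true = np.zeros(20)
w_true[[2, 7, 11]] = [1.5, -2.0, 1.0]
d = K @ w_true + 0.01 * rng.standard_normal(500)

# Forward-backward splitting on (1/2n)||Kw - d||^2 + lam * ||w||_1
w = np.zeros(20)
eta, lam = 0.5, 0.2
for _ in range(500):
    grad = K.T @ (K @ w - d) / len(d)              # forward (gradient) step
    w = soft_threshold(w - eta * grad, eta * lam)  # backward (proximal) step

print("surviving dictionary elements:", np.flatnonzero(w))
```

The soft-thresholding step zeroes out the weights of dictionary elements that no longer reduce the error, which is exactly the pruning behavior the abstract describes.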

Of related interest:

- Detection of nonlinear mixtures using Gaussian processes: Application to hyperspectral imaging. by Imbiriba, Tales, Bermudez, José Carlos M., Tourneret, Jean-Yves, and Richard, Cédric
- Online dictionary learning for kernel LMS. Analysis and forward-backward splitting algorithm. by Gao, Wei, Chen, Jie, Richard, Cédric, and Huang, Jianguo

**Join the CompressiveSensing subreddit or the Google+ Community and post there !**

Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
