Random Kitchen Sinks and manifolds: this is interesting. From the paper:
To address the above drawbacks, in this paper we propose to perform the sparse coding of SPD matrices by embedding Riemannian manifolds into reproducing kernel Hilbert spaces (RKHS) [13]. This is in contrast to directly embedding into Euclidean spaces [7,6,14].
Sparse Coding and Dictionary Learning for Symmetric Positive Definite Matrices: A Kernel Approach by Mehrtash T. Harandi, Conrad Sanderson, Richard Hartley, Brian C. Lovell
Recent advances suggest that a wide range of computer vision problems can be addressed more appropriately by considering non-Euclidean geometry. This paper tackles the problem of sparse coding and dictionary learning in the space of symmetric positive definite matrices, which form a Riemannian manifold. With the aid of the recently introduced Stein kernel (related to a symmetric version of Bregman matrix divergence), we propose to perform sparse coding by embedding Riemannian manifolds into reproducing kernel Hilbert spaces. This leads to a convex and kernel version of the Lasso problem, which can be solved efficiently. We furthermore propose an algorithm for learning a Riemannian dictionary (used for sparse coding), closely tied to the Stein kernel. Experiments on several classification tasks (face recognition, texture classification, person re-identification) show that the proposed sparse coding approach achieves notable improvements in discrimination accuracy, in comparison to state-of-the-art methods such as tensor sparse coding, Riemannian locality preserving projection, and symmetry-driven accumulation of local features.

The implementation is here.
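To get a feel for the kernel Lasso the abstract describes, here is a minimal sketch in Python. This is not the authors' implementation: it assumes the Stein kernel k(X, Y) = exp(-sigma * S(X, Y)), with S the symmetric Stein (Jensen-Bregman LogDet) divergence, and it solves the resulting kernelised Lasso over a fixed dictionary with plain proximal gradient (ISTA) rather than the solver used in the paper; all function names and parameters below are illustrative.

```python
import numpy as np

def stein_divergence(X, Y):
    """Symmetric Stein divergence: log det((X+Y)/2) - 0.5*(log det X + log det Y)."""
    _, ld_mid = np.linalg.slogdet(0.5 * (X + Y))
    _, ld_x = np.linalg.slogdet(X)
    _, ld_y = np.linalg.slogdet(Y)
    return ld_mid - 0.5 * (ld_x + ld_y)

def stein_kernel(X, Y, sigma=1.0):
    """Stein kernel k(X, Y) = exp(-sigma * S(X, Y)); positive definite for suitable sigma."""
    return np.exp(-sigma * stein_divergence(X, Y))

def kernel_sparse_code(X, dictionary, sigma=1.0, lam=0.1, n_iter=500):
    """Kernelised Lasso for one SPD query X against a list of SPD dictionary atoms.

    Minimises  y^T K y - 2 kx^T y + lam * ||y||_1  (the k(X,X) constant is dropped),
    where K is the dictionary Gram matrix and kx the query-to-dictionary kernel vector,
    using ISTA with soft-thresholding.
    """
    N = len(dictionary)
    K = np.array([[stein_kernel(Di, Dj, sigma) for Dj in dictionary] for Di in dictionary])
    kx = np.array([stein_kernel(X, Di, sigma) for Di in dictionary])
    y = np.zeros(N)
    step = 1.0 / (2.0 * np.linalg.eigvalsh(K).max() + 1e-12)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = 2.0 * (K @ y - kx)                      # gradient of the smooth quadratic part
        z = y - step * grad
        y = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-thresholding
    return y

# Toy usage with random SPD matrices (hypothetical data, not from the paper's experiments)
rng = np.random.default_rng(0)
def random_spd(d):
    A = rng.standard_normal((d, d))
    return A @ A.T + d * np.eye(d)

D = [random_spd(5) for _ in range(20)]
codes = kernel_sparse_code(random_spd(5), D, sigma=0.5, lam=0.05)
```

Because everything enters only through kernel evaluations, the problem stays convex in the codes y even though the atoms live on the SPD manifold; dictionary learning, which the paper also addresses, would additionally update the atoms themselves and is not sketched here.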