A while back, a study compared random features to scattering network features for hyperspectral imagery. This time, the authors look at the difference between using LIBSVM and GURLS (which ships with an implementation of random features) when performing classification in that field.
GURLS vs LIBSVM: Performance Comparison of Kernel Methods for Hyperspectral Image Classification by Nikhila Haridas, V. Sowmya, K. P. Soman
Kernel based methods have emerged as one of the most promising techniques for Hyperspectral Image classification and have attracted extensive research efforts in recent years. This paper introduces a new kernel based framework for Hyperspectral Image (HSI) classification using the Grand Unified Regularized Least Squares (GURLS) library. The proposed work compares the performance of different kernel methods available in the GURLS package with the library for Support Vector Machines, namely LIBSVM. The assessment is based on HSI classification accuracy measures and computation time. The experiment is performed on two standard hyperspectral datasets, namely Salinas A and the Indian Pines subset, captured by the AVIRIS (Airborne Visible Infrared Imaging Spectrometer) sensor. From the analysis, it is observed that the GURLS library is competitive with LIBSVM in terms of prediction accuracy, whereas computation time seems to favor LIBSVM. The major advantages of the GURLS toolbox over LIBSVM are its simplicity, ease of use, automatic parameter selection, and fast training and tuning of multi-class classifiers. Moreover, the GURLS package provides an implementation of the Random Kitchen Sinks algorithm, which can easily handle high dimensional hyperspectral images at much lower computational cost than LIBSVM.
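For readers unfamiliar with the Random Kitchen Sinks idea mentioned in the abstract, here is a minimal NumPy sketch of random Fourier features approximating an RBF kernel, followed by a regularized least squares classifier in that feature space. This is not the GURLS or LIBSVM API; the function names and hyperparameters (n_features, gamma, lam) are illustrative assumptions, and the data below is synthetic.

```python
import numpy as np

def random_fourier_features(X, n_features=500, gamma=1.0, rng=None):
    """Map X (n_samples, n_dims) into a feature space whose inner products
    approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(rng)
    n_dims = X.shape[1]
    # Frequencies drawn from N(0, 2*gamma*I) and uniform random phases.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(n_dims, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def rls_fit(Z, Y, lam=1e-3):
    """Regularized least squares on random features Z with one-hot targets Y."""
    d = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ Y)

# Usage sketch on synthetic "pixels": 200-band spectra, 3 classes.
X = np.random.rand(1000, 200)
y = np.random.randint(0, 3, size=1000)
Y = np.eye(3)[y]                                   # one-hot class targets
Z = random_fourier_features(X, n_features=300, gamma=0.5, rng=0)
W = rls_fit(Z, Y)
pred = np.argmax(Z @ W, axis=1)                    # predicted class per pixel
```

The point of the approach is that once the data are mapped to a modest number of random features, training reduces to a linear solve, which is why it can scale to high dimensional hyperspectral cubes more cheaply than a full kernel SVM.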
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.