
Monday, August 20, 2018

SPORCO: Convolutional Dictionary Learning - implementation -



Brendt sent me the following a few days ago: 

Hi Igor,
We have two new papers on convolutional dictionary learning as well as some recent related code. Could you please post an announcement on Nuit Blanche?
Brendt
Sure Brendt! It is already mentioned on the Advanced Matrix Factorization Jungle page, as this is an awesome update to the previous announcement.



"Convolutional Dictionary Learning: A Comparative Review and New Algorithms", available from http://dx.doi.org/10.1109/TCI.2018.2840334 and https://arxiv.org/abs/1709.02893, reviews existing batch-mode convolutional dictionary learning algorithms and proposes some new ones with significantly improved performance. Implementations of all of the most competitive algorithms are included in the Python version of the SPORCO library at https://github.com/bwohlberg/sporco .

"First and Second Order Methods for Online Convolutional Dictionary Learning", available from http://dx.doi.org/10.1137/17M1145689 and https://arxiv.org/abs/1709.00106, extends our previous work and proposes some new algorithms for online convolutional dictionary learning that we believe outperform existing alternatives. Implementations of all of the new algorithms are included in the Matlab version of the SPORCO library at http://purl.org/brendt/software/sporco and the first order algorithm is also included in the Python version of the SPORCO library at https://github.com/bwohlberg/sporco . A very recent addition to the Python version is the ability to exploit the SPORCO-CUDA extension to greatly accelerate the learning process.



Convolutional sparse representations are a form of sparse representation with a dictionary that has a structure that is equivalent to convolution with a set of linear filters. While effective algorithms have recently been developed for the convolutional sparse coding problem, the corresponding dictionary learning problem is substantially more challenging. Furthermore, although a number of different approaches have been proposed, the absence of thorough comparisons between them makes it difficult to determine which of them represents the current state of the art. The present work both addresses this deficiency and proposes some new approaches that outperform existing ones in certain contexts. A thorough set of performance comparisons indicates a very wide range of performance differences among the existing and proposed methods, and clearly identifies those that are the most effective.
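The model described in this abstract, a signal represented as the sum of convolutions of a small set of learned filters with sparse coefficient maps, can be sketched in a few lines of NumPy. This is an illustrative 1-D example of the convolutional synthesis model only, not code from SPORCO, and `conv_synthesis` is a hypothetical helper name:

```python
import numpy as np

def conv_synthesis(D, X):
    """Reconstruct a 1-D signal as the sum of circular convolutions of
    each filter D[m] (length L) with its coefficient map X[m] (length N)."""
    M, L = D.shape
    N = X.shape[1]
    s = np.zeros(N)
    for d, x in zip(D, X):
        dp = np.zeros(N)
        dp[:L] = d  # zero-pad the short filter to signal length
        # circular convolution computed in the frequency domain; this
        # convolutional structure is what makes the dictionary translation
        # invariant
        s += np.real(np.fft.ifft(np.fft.fft(dp) * np.fft.fft(x)))
    return s

# Two short filters and very sparse coefficient maps
D = np.array([[1.0, -1.0, 0.5],
              [0.25, 0.5, 0.25]])
X = np.zeros((2, 16))
X[0, 2] = 2.0   # filter 0 appears, scaled by 2, at position 2
X[1, 9] = -1.0  # filter 1 appears, negated, at position 9
s = conv_synthesis(D, X)
```

Each nonzero coefficient simply places a scaled copy of its filter at that position in the signal; dictionary learning is the problem of recovering the filters `D` from training signals.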


Convolutional sparse representations are a form of sparse representation with a structured, translation invariant dictionary. Most convolutional dictionary learning algorithms to date operate in batch mode, requiring simultaneous access to all training images during the learning process, which results in very high memory usage and severely limits the training data that can be used. Very recently, however, a number of authors have considered the design of online convolutional dictionary learning algorithms that offer far better scaling of memory and computational cost with training set size than batch methods. This paper extends our prior work, improving a number of aspects of our previous algorithm; proposing an entirely new one, with better performance, and that supports the inclusion of a spatial mask for learning from incomplete data; and providing a rigorous theoretical analysis of these methods.
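The batch-versus-online distinction in this abstract comes down to updating the dictionary from one training signal at a time instead of all of them at once. As a rough illustration of what a first-order update of this kind looks like, here is a 1-D NumPy sketch: a generic stochastic-gradient step followed by the usual unit-ball projection on the filters. This is not the papers' actual algorithms (which include masking, second-order updates, and convergence guarantees), and `online_dict_step` is a hypothetical helper name:

```python
import numpy as np

def online_dict_step(D, s, X, eta):
    """One first-order dictionary update for the 1-D convolutional model
    s ~ sum_m d_m * x_m, given the coefficient maps X already computed
    (by a sparse coding step) for the incoming signal s."""
    M, L = D.shape
    N = len(s)
    # residual of the current reconstruction (circular convolution via FFT)
    Df = np.fft.fft(np.pad(D, ((0, 0), (0, N - L))), axis=1)
    Xf = np.fft.fft(X, axis=1)
    r = s - np.real(np.fft.ifft((Df * Xf).sum(axis=0)))
    # gradient of 0.5*||r||^2 w.r.t. each filter is minus the
    # cross-correlation of its coefficient map with the residual,
    # restricted to the filter support
    G = -np.real(np.fft.ifft(np.conj(Xf) * np.fft.fft(r), axis=1))[:, :L]
    D = D - eta * G
    # project each filter onto the unit ball, the standard constraint
    # that prevents the scaling ambiguity between filters and coefficients
    return D / np.maximum(1.0, np.linalg.norm(D, axis=1, keepdims=True))
```

Because each step touches only the current signal, memory usage is independent of the training set size, which is the scaling advantage the abstract refers to.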


Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
