Seven papers that were recently featured here on Nuit Blanche have also been submitted to ICLR 2015 (they are all under the ICLR2015 tag):

- Improving approximate RPCA with a k-sparsity prior
- Why does Deep Learning work? - A perspective from Group Theory
- Compressing Deep Convolutional Networks using Vector Quantization
- Unsupervised Learning of Spatiotemporally Coherent Metrics
- A la Carte - Learning Fast Kernels
- On the Stability of Deep Networks
- Deep Fried Convnets

Others are listed here.

The papers for ICLR 2015 are now open for discussion! And from here:

(this requires an ICLR 2015 CMT account, which you can sign up for if you don't have one already...)

The open commenting period for ICLR 2015 has begun.

This year we will be using CMT's public commenting mechanism, so to participate you must log in to the ICLR 2015 CMT site:

https://cmt.research.microsoft.com/ICLR2015/Protected/PublicComment.aspx

If you don't already have a CMT account, please request one.

The list of submitted papers, with links to arXiv, is available at

http://www.iclr.cc/doku.php?id=iclr2015:main

Please remember that comments will be visible to anyone who logs in to CMT. We intend to make the comments available on openreview.net once they finish a major revision of their infrastructure. The anonymous reviews will be visible on CMT at the beginning of the discussion period, and will also eventually be made available on openreview.net.

If you want to know what is meant by open commenting, please see this description of the ICLR reviewing model: http://www.iclr.cc/doku.php?id=pubmodel

If you have any questions, please send them to iclr2015.programchairs@gmail.com

**Join the CompressiveSensing subreddit or the Google+ Community and post there!**

Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization, and calibration issues on LinkedIn.
