So this past month, since the last Nuit Blanche in Review (February 2017), we asked ourselves "How can you tell the world is changing right before your eyes?", passed six million page views on Nuit Blanche, advertised a job in Machine Learning at LightOn (as well as many others; see the jobs section), saw one potential impact of AI in the future (Warming Up), and featured the first Compressed Sensing preprint that uses Generative Models. We also featured several announcements (summer schools), had two Paris Machine Learning meetups, and shared several videos. Enjoy!
Sunday Morning Insight
- How can you tell the world is changing right before your eyes?
Implementations
- Implementation: Compressed Sensing using Generative Models
- Evolution Strategies as a Scalable Alternative to Reinforcement Learning - implementation -
- Evolution Strategies as a Scalable Alternative to Reinforcement Learning - implementation - part 2
- Preconditioned Data Sparsification for Big Data with Applications to PCA and K-means - implementation -
- Sparsebn - A new R package for learning sparse graphical models from high-dimensional data via sparse regularization - implementation -
- Insense - implementation -
- Sparsity/Undersampling Tradeoffs in Anisotropic Undersampling, with Applications in MR Imaging/Spectroscopy - implementation -
The blogs:
In-depth:
- A Perspective on Deep Imaging
- Learning in the Machine: Random Backpropagation and the Learning Channel
- Random Features for Compositional Kernels
- Scaling the Scattering Transform: Deep Hybrid Networks - implementation - / The Shattered Gradients Problem: If resnets are the answer, then what is the question?
- Making Backpropagation Plausible
- Random Triggering Based Sub-Nyquist Sampling System for Sparse Multiband Signal
- The Unreasonable Effectiveness of Random Orthogonal Embeddings
- k-NN at Scale and Deep Learning
- Compressed Sensing using Generative Models
- Solution of linear ill-posed problems using random dictionaries
- Least Squares Generative Adversarial Networks
- Deep Semi-Random Features for Nonlinear Function Approximation
- McGan: Mean and Covariance Feature Matching
Jobs:
- Job: Machine Learning, LightOn, Paris.
- CSjob: Three Postdocs, SAMP-Lab, Technion
- CSjobs: Postdocs, Signal and Image Processing Institute, University of Southern California
- CSjob: Internship (Spring/Summer/Fall 2017), IFP Energies nouvelles, France
- Jobs: Two Senior Research Scientist/Principal Research Scientist, NPL, Teddington, U.K.
- Job: Research Associate in Deep Learning, NRL, Washington D.C.
Meetups and announcements:
- Tonight: Paris Machine Learning #7 Season 4 @ Algolia, deep reinforcement learning, feature engineering
- Paris Machine Learning Hors Série #10: Workshop SPARK (session 1)
- The Great Convergence goes mainstream: CfP, "Machine Learning for Image Reconstruction" in IEEE Trans. on Medical Imaging.
- Data Science Summer School, Ecole Polytechnique, France, August 28th to September 1st, 2017
- Summer School "Structured Regularization for High-Dimensional Data Analysis" - IHP Paris - June 19th to 22nd
- Emmanuel Candès in Paris, Insense, Imaging M87, gauge and perspective duality, Structured signal recovery from quadratic measurements
Videos:
- Saturday Morning Video: Deep Learning And The Future Of AI, Yann LeCun at Tsinghua University
- Saturday Morning Videos: Representation Learning Workshop at Simons Institute, Berkeley (March 27th-31st, 2017)
- Saturday Morning Video: #NIPS2016 Symposium, Recurrent Neural Networks and Other Machines that Learn Algorithms
- Saturday Morning Video: The Role of Multi-Agent Learning in Artificial Intelligence Research at DeepMind
Other:
Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
1 comment:
I made some comments related to compressive sensing on this YouTube video:
https://youtu.be/ceD736_Fknc
I guess it would be better to start with a zero-weighted random projection net and set the outputs of the reality-manifold net to all 1's on real samples during training, rather than starting with a randomly weighted net and learning zeros. The reason is to avoid catastrophic forgetting and to better allow lossy compression of the reality manifold. Well, that is just an intuition based on experience with such nets.
The idea then is that during decompression you use the reality-manifold net to guide the result toward the nearest point on the manifold.
I hope that makes some sort of sense!
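For readers who want to make the "guide the result toward the nearest point on the manifold" step concrete, here is a minimal sketch in the spirit of the Compressed Sensing using Generative Models preprint featured above: given linear measurements y = Ax, search the generator's latent space for the point whose image best explains y. Everything below is illustrative rather than taken from the preprint or the comment: the "generator" is a fixed random two-layer net standing in for a trained model, and the optimizer is plain gradient descent with random restarts.

```python
# Illustrative sketch: recovery from compressive measurements guided by a
# generative model, in the spirit of "Compressed Sensing using Generative
# Models" (Bora et al.). All sizes and the toy generator are made up.
import numpy as np

rng = np.random.default_rng(0)

n, m, k, h = 100, 30, 5, 50   # signal dim, measurements, latent dim, hidden dim

# Stand-in "generator" G: R^k -> R^n. In practice W1, W2 would come from
# training a generative model on real samples; here they are fixed random.
W1 = rng.normal(size=(h, k)) / np.sqrt(k)
W2 = rng.normal(size=(n, h)) / np.sqrt(h)

def G(z):
    return W2 @ np.tanh(W1 @ z)

def grad_z(z, A, y):
    """Analytic gradient of 0.5 * ||A G(z) - y||^2 with respect to z."""
    a = np.tanh(W1 @ z)
    r = A @ (W2 @ a) - y                  # residual in measurement space
    back = W2.T @ (A.T @ r)               # residual pulled back through W2
    return W1.T @ (back * (1.0 - a**2))   # chain rule through tanh

def loss(z, A, y):
    return 0.5 * np.sum((A @ G(z) - y) ** 2)

# Ground truth lies on the generator's manifold; we observe m < n projections.
z_true = rng.normal(size=k)
x_true = G(z_true)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true

# Decompression: descend in latent space so that x_hat = G(z_hat) is the point
# on the learned manifold most consistent with the measurements. The problem
# is nonconvex, hence a few random restarts.
best_z, best_loss = None, np.inf
for _ in range(5):
    z = rng.normal(size=k)
    for _ in range(3000):
        z -= 1e-3 * grad_z(z, A, y)
    if loss(z, A, y) < best_loss:
        best_z, best_loss = z, loss(z, A, y)

x_hat = G(best_z)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The commenter's proposal of training a reality-manifold net to output 1's on real samples is a different, classifier-style way of learning the manifold; the sketch above uses the generator-based variant from the preprint instead.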