Saturday, May 31, 2014

Nuit Blanche in Review (May 2014)

Since the last Nuit Blanche in Review (April 2014), Nuit Blanche has crossed 1 million visits (and is closing in on 3 million page views). Yesterday's entry on the slides of the Sublinear Algorithms 2014 meeting still resonates with me, but this past month I also created a meetup around note by note cooking (Connections between Note by Note Cooking and Machine Learning). The Advanced Matrix Factorization Jungle page added some factorizations found in Graph Matching and Archetypal Analysis. We also wondered how FROG could be addressed with Nonlinear Compressive Sensing, saw the growth of Multiple Regularizers, and saw the possibility that one-bit compressive sensing could provide a way of understanding neural networks, or that real Neurons could be thought of as Signal Processing Devices. Sharp phase transitions such as Donoho-Tanner's were also seen as a way to tell good neural networks from bad ones in Compressive Sensing and Short Term Memory / Visual Nonclassical Receptive Field Effects, or to provide Sharp Performance Bounds for Graph Clustering. We were also made aware of how the theory of convex optimization influences Machine Learning and of a geometric perspective on the problem of estimation in high dimensions, and we had an overview of Sparsity-Aware Learning and Compressed Sensing.

We also had quite a few implementations made available.

Implementations:
Sunday Morning Insights:
Experiment / Hardware:
Focused Entries:

Meetings / Meetups:
Q&A:

Video:
Comics:
Jobs:

Hamming's time

Saturday Morning Videos

Credit: ESA/NASA, SOHO, LASCO C2


Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
