Wednesday, July 01, 2015

Nuit Blanche in Review ( June 2015 )

Today, I will probably be at the Challenge in Optimization for Data Science workshop.

Since the last Nuit Blanche in Review ( May 2015 ) we certainly got closer to Pluto and Charon. We covered a few subjects ranging from third-generation genome sequencing (1) to hardware for machine learning (2) to a long series of posts on random projections (3-7), and more:

  1. Compressive phase transition, a million genomes and 3rd generation sequencing
  2. Hardware for Machine Learning
  3. Slides: Learning with Random Projections, Random Projections for Machine Learning and Data Mining: Theory & Applications
  4. Random Projections as Regularizers, Compressive Linear Least Squares Regression and more
     The Unreasonable Effectiveness of Random Projections in Computer Science
  5. Improved Bounds on the Dot Product under Random Projection and Random Sign Projection
     Random Features and random projections
  6. Random Maxout Features look like Quantized JL Embeddings
  7. Extreme Compressive Sampling for Covariance Estimation
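Since random projections dominate this month's list, here is a minimal sketch of the basic tool behind those posts: a Gaussian random projection in the Johnson-Lindenstrauss style. The function name and dimensions below are illustrative, not taken from any of the linked papers.

```python
import numpy as np

def random_projection(X, k, seed=0):
    """Project the rows of X (n x d) down to k dimensions with a
    Gaussian random matrix, scaled so that Euclidean norms and
    distances are preserved in expectation (Johnson-Lindenstrauss)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # The 1/sqrt(k) scaling makes E[||R^T x||^2] = ||x||^2.
    R = rng.standard_normal((d, k)) / np.sqrt(k)
    return X @ R

# Usage: pairwise distances survive a 5000 -> 500 dimension reduction.
rng = np.random.default_rng(1)
X = rng.standard_normal((10, 5000))
Y = random_projection(X, 500)
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(abs(proj - orig) / orig)  # small relative distortion
```

The striking part, and the theme of several posts above, is that the target dimension k only needs to grow logarithmically in the number of points, independently of the original dimension d.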
From these, we shared two insights:

and quite a few implementations:
Some theses:
Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
