Today, I will probably be at the Challenge in Optimization for Data Science workshop.
Since the last Nuit Blanche in Review (May 2015), we certainly got closer to Pluto and Charon. We covered a few subjects ranging from 3rd generation genome sequencing (1) to hardware for machine learning (2) to a long series of posts on random projections (3-7), and more (a short illustrative sketch of random projections follows the list below):
- Compressive phase transition, a million genomes and 3rd generation sequencing
- Hardware for Machine Learning
- Slides: Learning with Random Projections, Random Projections for Machine Learning and Data Mining: Theory & Applications
- Random Projections as Regularizers, Compressive Linear Least Squares Regression and more
- The Unreasonable Effectiveness of Random Projections in Computer Science - Improved Bounds on the Dot Product under Random Projection and Random Sign Projection
- Random Features and random projections - Random Maxout Features look like Quantized JL Embeddings
- Extreme Compressive Sampling for Covariance Estimation
- Sunday Morning Insight: All hard problems are just slow feedback loops in disguise.
- Sunday Morning Insight: How Data Science Is.
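Since so many of this month's posts revolve around random projections, here is a minimal, self-contained sketch of the Johnson-Lindenstrauss idea behind them. It is a toy illustration with made-up dimensions, assuming NumPy; it is not taken from any of the papers or implementations listed in this review. The point is simply that multiplying high-dimensional data by a random Gaussian matrix scaled by 1/sqrt(k) approximately preserves pairwise distances.

```python
import numpy as np

# Toy Johnson-Lindenstrauss-style random projection.
# All sizes below are illustrative assumptions.
rng = np.random.default_rng(0)

n, d, k = 100, 10_000, 256          # points, ambient dimension, target dimension
X = rng.standard_normal((n, d))     # toy high-dimensional data

# Random Gaussian projection, scaled so squared norms are preserved
# in expectation: E[||P x||^2] = ||x||^2.
P = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ P.T                         # projected points, shape (n, k)

# Distortion of one pairwise distance (should be close to 1).
i, j = 0, 1
ratio = np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
print(f"distance ratio after projection: {ratio:.3f}")
```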
and quite a few implementations:
- Slides and Implementation: Random forests with random projections of the output space for high dimensional multi-label classification
- Stable Autoencoding: A Flexible Framework for Regularized Low-Rank Matrix Estimation - implementation
- Towards Large Scale Continuous EDA: A Random Matrix Theory Perspective - implementation -
- Raking the Cocktail Party - implementation -
- Simultaneous Orthogonal Matching Pursuit With Noise Stabilization: Theoretical Analysis - implementation -
- MaCBetH: Matrix Completion from Fewer Entries: Spectral Detectability and Rank Estimation - implementation -
- Riemannian preconditioning for tensor and matrix completion - implementation(s) -
- Modulated Unit-Norm Tight Frames for Compressed Sensing - implementation -
- Randomer Forests - implementation -
- Sparse Proteomics Analysis - A compressed sensing-based approach for feature selection and classification of high-dimensional proteomics mass spectrometry data - implementation -
- Democratic Representations - implementation -
- Thesis: Learning in High Dimensions with Projected Linear Discriminants by Robert Durrant
- Thesis: A Randomized Proper Orthogonal Decomposition Method for Reducing Large Linear Systems
- Thesis: Listening to Distances and Hearing Shapes: Inverse Problems in Room Acoustics and Beyond
In-depth:
- Inferring Graphs from Cascades: A Sparse Recovery Framework
- 3D imaging in volumetric scattering media using phase-space measurements
- Physics-driven inverse problems made tractable with cosparse regularization
- Isometric sketching of any set via the Restricted Isometry Property
- Taylor Polynomial Estimator for Estimating Frequency Moments
- Annihilating Filter based Low Rank Hankel Matrix Approach for Image Inpainting
- Mismatched Estimation in Large Linear Systems
- 1-bit and Quantized Compressive Sensing
- Reader's comment: Long reads and the P-river, Hardware for Machine Learning and some implementation
- ShapeFit: Exact location recovery from corrupted pairwise directions
- CT Brush and CancerZap!: two video games for computed tomography dose minimization
- Fast and Guaranteed Tensor Decomposition via Sketching
- Vowpal Wabbit: A Machine Learning System ( Paris Machine Learning Meetup "Hors Série" #4 Season 2 )
- Paris Machine Learning Meetup #10 Season 2 Finale: "And so it begins": Deep Learning, Recovering Robots, Vowpal and Hadoop, Predicsis, Matlab, Bayesian test, Experiments on #ComputationalComedy & A.I.
Conferences:
- ICML 2015 papers are out
- Two Conferences: MMMA-2015, Moscow, August 2015 and CfP RSL-CV 2015, Santiago, Chile, Dec. 2015
- CfP: The 3rd International Workshop on High Dimensional Data Mining (HDM’15)
Videos:
- Saturday Morning Videos: Falling Back to Earth, D-Day
- Saturday Morning Videos: High Angle Take Offs
- Saturday Morning Videos: High Altitude Balloon with Android