Saturday, May 14, 2016

Saturday Morning Video: Algorithmic Aspects of Inference, Ankur Moitra @CIRM

A week-long school was held at the Centre International de Rencontres Mathématiques (CIRM) in Marseille, France. It immediately preceded the IHP Thematic Program in Paris, which made a concerted effort to broaden and deepen the connections between information theory and the theory of computation. The school consisted of several tutorials, each taught by a leading researcher, with the goal of introducing the key questions, mathematical tools, and open problems in an area.

Here is the first video.

Algorithmic Aspects of Inference   Ankur Moitra (MIT)

Parametric inference is one of the cornerstones of statistics, but much of the classic theory revolves around asymptotic notions of convergence and relies on estimators that are hard to compute (particularly in high-dimensional problems). In this tutorial, we will explore the following questions:
(1) For some of the fundamental problems in statistics, are there surrogates for the maximum likelihood estimator that also converge to the true parameters at an inverse polynomial rate but, in contrast, can be computed efficiently?
(2) Can we establish tradeoffs between sample complexity and computational complexity? And what types of hardness assumptions allow us to explore this space?
We will cover topics such as the method of moments, learning mixture models, tensor decomposition, sparse PCA and matrix/tensor completion.
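To make the first topic in that list concrete, here is a minimal sketch of the method of moments (an illustration only, not code from the lecture): the unknown parameters of a distribution are recovered by matching empirical moments to their theoretical counterparts, here for a single Gaussian.

```python
import numpy as np

# Illustrative sketch: method-of-moments estimation for a Gaussian.
# Draw samples from N(mean=2, std=3); the true parameters are assumptions
# chosen for this example.
rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=3.0, size=100_000)

m1 = samples.mean()           # first empirical moment, E[X]
m2 = (samples ** 2).mean()    # second empirical moment, E[X^2]

mean_hat = m1                 # match E[X] = mean
var_hat = m2 - m1 ** 2        # match Var[X] = E[X^2] - (E[X])^2

print(mean_hat, var_hat)      # close to the true values 2.0 and 9.0
```

Unlike maximizing the likelihood, each step here is a cheap closed-form computation, which is the spirit of the "efficient surrogate" estimators the tutorial discusses; for mixtures and tensor problems the moment-matching systems are of course far more involved.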

Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.

