Here are the videos of the IMA/AP Workshop: Information Theory and Concentration Phenomena, which took place on April 13-17, 2015. (IMA webmaster: please note that this video is that of Andrea Montanari, not Victoria Kostina.)
Presentation of the workshop:
Concentration phenomena have come to play a significant role in
probability, statistics, and computer science. The purpose of this
workshop is to bring together applied and theoretical researchers to
stimulate further progress on concentration phenomena and their
connections with other areas. Ideas from information theory and geometry
can be used to establish new types of concentration inequalities, and
they provide a deeper understanding of established results. In
particular, geometric and entropic inequalities play an important role
in finite-dimensional and asymptotic concentration phenomena. At the
same time, concentration results can lead to a deeper understanding of
information theory and geometry. Although these principles are well
established, there have been many striking advances in recent years.
Researchers have made significant progress on subadditivity of quantum
information, quantitative entropy power inequalities, and concentration
properties for random variables that have many symmetries (such as spin
glasses and random graph models). Another line of work uses tools from
quantum statistical mechanics to analyze the random matrices that arise
in numerical analysis, sparse optimization, and statistics. This
workshop will explore the interactions among these ideas in the hope of
generating new advances.
Talk material is available here, but there isn't much of it. Here are the videos:
Information and Statistics, Andrew R. Barron (Yale University)
Correlation Distillation, Elchanan Mossel (University of California, Berkeley)
Fixed- and Variable-Length Data Compression at Finite Blocklength, Victoria Kostina (California Institute of Technology)
SOS and the Hidden Clique, Andrea Montanari (Stanford University)
Entropy and Thinning of Discrete Random Variables, Oliver Johnson (University of Bristol)
Weak and Strong Moments of l_r-norms of Log-concave Vectors, Rafal Latala (University of Warsaw)
Logarithmic Sobolev Inequalities in Discrete Product Spaces: A Proof by a Transportation Cost Distance, Katalin Marton (Hungarian Academy of Sciences (MTA))
Gaussian Phase Transitions and Conic Intrinsic Volumes: Steining the Steiner formula, Larry Goldstein (University of Southern California)
Eigenvalue Distribution of Optimal Transportation, Bo'az Klartag (Tel Aviv University)
Concentration of Spectral Measures of Random Matrices, Elizabeth Meckes (Case Western Reserve University)
Bounding Marginal Densities via Affine Isoperimetry, Grigorios Paouris (Texas A&M University)
Stability Estimates for the log-Sobolev Inequality, Emanuel Indrei (Carnegie Mellon University)
Curvature-Dimension Condition for Non-Conventional Dimensions, Emanuel Milman (Technion-Israel Institute of Technology)
On Talagrand's Convolution Conjecture in Gaussian Space, Ronen Eldan (University of Washington)
Random Matrices and Concentration, Van H. Vu (Yale University)
Contraction Estimates for Markov Kernels via Information-Transportation Inequalities, Maxim Raginsky (University of Illinois at Urbana-Champaign)
The Lower Tail of Random Quadratic Forms, via the PAC Bayesian Method, Roberto Imbuzeiro Oliveira (Institute of Pure and Applied Mathematics (IMPA))
How Large is the Norm of a Random Matrix?, Ramon van Handel (Princeton University)
The organizers were:
Join the CompressiveSensing subreddit or the Google+ Community and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.