Monday, November 23, 2015

PCMI Summer School, "The Mathematics of Data", June 30 – July 20, 2016, Utah

Michael sent me the following last week:
Hi Igor,


Hope all is well with you.


I wanted to let you know about a summer school, "The Mathematics of Data," that Anna Gilbert, John Duchi, and I are running at the Park City Mathematics Institute next summer, June 30 to July 20.

Attached is a pdf of the program announcement, and here is a link with more information:



I imagine that this would be of interest to many of your followers, so would it be possible for you to advertise this program?

Thanks


Thanks Michael! As far as I can tell, out of the few programs offered, two are organized by Michael, Anna, and John.

Here is the program for the Graduate Summer School:
The Mathematics of Data
The Graduate Summer School bridges the gap between a general graduate education in mathematics and the specific preparation necessary to do research on problems of current interest. In general, these students will have completed their first year, and in some cases, may already be working on a thesis. While a majority of the participants will be graduate students, some postdoctoral scholars and researchers may also be interested in attending.
The main activity of the Graduate Summer School will be a set of intensive short lectures offered by leaders in the field, designed to introduce students to exciting, current research in mathematics. These lectures will not duplicate standard courses available elsewhere. Each course will consist of lectures with problem sessions. Course assistants will be available for each lecture series. The participants of the Graduate Summer School meet three times each day for lectures, with one or two problem sessions scheduled each day as well.
In order to derive insight from data, one needs to perform computations, and one needs to perform statistical inference.  Both of these tasks raise important and fundamental mathematical questions, especially when one considers realistic sparsity and noise properties, realistic size scales, realistic temporal properties, etc.  These questions are often considered outside traditional mathematics departments, and they present challenges to the theoretical foundations of related methodological areas such as computer science and statistics.  This requires revisiting traditional and novel areas of applied mathematics to determine which subset of those areas can be used to help establish the theoretical foundations of modern large-scale data analysis.  Topics will include Randomized Linear Algebra, Topological Data Analysis, Theoretical Computer Science, Theoretical Statistics, Functional Analysis, Scientific Computing, and Optimization.  The goal of the program is to present these ideas in perspective, covering the necessary background and leading up to the recent progress and open problems.
Student preparation: We seek motivated students interested in the mathematical, e.g., algorithmic and statistical, aspects of modern large-scale data analysis, including theoretically-inclined students from computer science, statistics, applied mathematics, and related areas.  Though familiarity with some of the topics listed above would be helpful, the formal prerequisites are limited to the content of standard introductory courses in linear algebra, probability, and optimization.
The 26th Annual PCMI Summer Session will be held June 30 – July 20, 2016.
Click HERE to apply to the Graduate Summer School program.

2016 Organizers
John Duchi, Stanford University; Anna Gilbert, University of Michigan; and Michael Mahoney, University of California, Berkeley

2016 Graduate Summer School Lecturers
Petros Drineas, Rensselaer Polytechnic Institute
RandNLA: Randomization in Numerical Linear Algebra
The introduction of randomization in the design and analysis of algorithms for matrix computations (such as matrix multiplication, least-squares regression, the Singular Value Decomposition (SVD), etc.) over the past 15 years has provided a new paradigm and a complementary perspective to traditional numerical linear algebra approaches. These novel approaches were motivated by technological developments in many areas of scientific research that permit the automatic generation of large data sets, which are often modeled as matrices.
We will outline how such approaches can be used to approximately solve problems ranging from matrix multiplication and the Singular Value Decomposition (SVD) of matrices to the Column Subset Selection Problem and the CX decomposition. Applications of the proposed algorithms to data analysis tasks (with a particular focus on population genetics) will also be discussed.
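To give a taste of the flavor of these methods (this is a generic sketch of a randomized range-finder for the SVD, assuming NumPy, and not necessarily the specific algorithms the course will cover; the function name and parameters are illustrative):

```python
import numpy as np

def randomized_svd(A, k, p=5, rng=None):
    """Sketch of a randomized low-rank SVD: sample the range of A with a
    random test matrix, then compute an exact SVD of a much smaller matrix."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    # 1. Multiply A by a random Gaussian matrix to sample its range
    #    (p extra columns of oversampling improve the approximation).
    Omega = rng.standard_normal((n, k + p))
    Y = A @ Omega
    # 2. Orthonormalize the sample to get a basis Q for the approximate range.
    Q, _ = np.linalg.qr(Y)
    # 3. Project A onto that basis and take the SVD of the small matrix.
    B = Q.T @ A                      # (k+p) x n, cheap to decompose
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_hat
    return U[:, :k], s[:k], Vt[:k]

# On an exactly rank-10 matrix, the randomized approximation is near-exact.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 10)) @ rng.standard_normal((10, 300))
U, s, Vt = randomized_svd(A, k=10, rng=1)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(err)  # tiny relative error, since rank(A) = 10
```

The point of the paradigm is visible in step 3: the expensive dense SVD is applied only to a (k+p) x n sketch rather than to the full m x n matrix.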

Cynthia Dwork, Microsoft Research
Course description coming soon!

Robert Ghrist, University of Pennsylvania
Topological Data Analysis
This course will cover the background, techniques, and applications of Topological Data Analysis. Beginning with an introduction to the classical tools of algebraic topology, we will progress through applications to point clouds, persistence, networks, and more, with far-ranging applications. No background in topology will be assumed.
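For a concrete taste of persistence, here is a minimal sketch (assuming NumPy; all names and the toy point cloud are illustrative, not course material) of the simplest case, 0-dimensional persistent homology, where the death times of connected components coincide with the edge lengths of a minimum spanning tree:

```python
import numpy as np
from itertools import combinations

def h0_persistence(points):
    """0-dimensional persistence of a Vietoris-Rips filtration on a point
    cloud: every point is born at scale 0, and a component dies when the
    growing balls merge it into another. The finite death times are exactly
    the minimum-spanning-tree edge lengths (Kruskal's algorithm)."""
    n = len(points)
    parent = list(range(n))
    def find(i):                     # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    # All pairwise distances, processed in increasing order of scale.
    edges = sorted((np.linalg.norm(points[i] - points[j]), i, j)
                   for i, j in combinations(range(n), 2))
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                 # two components merge: one dies at scale d
            parent[ri] = rj
            deaths.append(d)
    return deaths                    # n-1 finite deaths; one component persists

# Two well-separated clusters: one long-lived feature stands out.
pts = np.array([[0.0, 0], [0.1, 0], [0.2, 0], [5.0, 0], [5.1, 0]])
print(sorted(h0_persistence(pts))[-1])  # ≈ 4.8, the gap between the clusters
```

The large death time reflects a genuine feature of the data (two clusters), while the small ones are noise-scale merges; higher-dimensional persistence extends the same born/dies bookkeeping to loops and voids.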

Piotr Indyk, Massachusetts Institute of Technology
Recent Developments in the Sparse Fourier Transform
The discrete Fourier transform (DFT) is a fundamental component of numerous computational techniques in signal processing and scientific computing. The most popular means of computing the DFT is the fast Fourier transform (FFT). However, with the emergence of big data, the “fast” in FFT is often no longer fast enough. In addition, in many applications it is hard to acquire a sufficient amount of data to compute the desired Fourier transform in the first place.
The Sparse Fourier Transform (SFT) is based on the insight that many real-world signals are sparse, i.e., most of the frequencies have a negligible contribution to the overall signal. The SFT exploits this insight by computing a compressed Fourier transform in time proportional to the sparsity of the data, not its size. Furthermore, it uses only a subset of the signal's samples.
The goal of this talk is to survey recent developments in this area and explain the basic techniques with examples and applications. Further resources are available at: http://groups.csail.mit.edu/netmit/sFFT/.
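The sparsity insight is easy to see numerically. The sketch below (assuming NumPy; the signal and parameters are made up for illustration, and this is not an actual SFT implementation) shows a spectrally sparse signal and the aliasing trick that lets the spikes be located from a small subset of samples:

```python
import numpy as np

# A length-n signal whose spectrum is k-sparse: only 3 active frequencies.
n, k = 1024, 3
freqs = np.array([5, 300, 700])                   # hypothetical sparse support
amps = np.array([1 + 2j, -3 + 0.5j, 0.25 - 1j])
t = np.arange(n)
x = sum(a * np.exp(2j * np.pi * f * t / n) for a, f in zip(amps, freqs))

# The full FFT touches all n samples, yet only k coefficients matter.
X = np.fft.fft(x)
support = np.flatnonzero(np.abs(X) > 1e-6 * n)
print(np.array_equal(support, freqs))             # True

# A core SFT ingredient is aliasing: the FFT of every B-th sample folds
# frequency f into bucket f mod (n//B), so the k spikes can be located
# from only n//B samples (here 128 instead of 1024).
B = 8
buckets = np.fft.fft(x[::B])
hit = set(np.flatnonzero(np.abs(buckets) > 1e-6 * n))
print(hit == {int(f) % (n // B) for f in freqs})  # True: no bucket collisions here
```

A real SFT combines such subsampled FFTs with random permutations and filters to resolve bucket collisions and estimate the coefficient values; the survey linked above covers those details.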

Mauro Maggioni, Duke University
Course description coming soon!

Gunnar Martinsson, University of Colorado
Course description coming soon!

Kunal Talwar, Microsoft Research
Course description coming soon!

Roman Vershynin, University of Michigan
Course description coming soon!

Stephen J. Wright, University of Wisconsin
Course description coming soon!
 
Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there !
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
