
You thought you'd go home for the weekend without much to read on your shiny new iPad? Well, forget that. Today two workshops have released some of the slide presentations made there. To get full publicity for the event, for its sponsors, or for the presenters, the process of gathering slides from the presenters and making them available should not be an afterthought. Enjoy!
The workshop on Sparsity and Computation at the Hausdorff Center for Mathematics took place three weeks ago, and it looks like we now have access to some of the presentations made there. Here they are:
Programme and Slides
Please click on the title to obtain the PDF file of the talk (if available).
Monday, June 7, 2010
- 8:30-9:00 Registration
- 9:00-9:10 Opening
- 9:10-10:00 Ingrid Daubechies, Princeton University
- Animation, Teeth and Skeletons ...
- 10:10-11:00 Erich Novak, University of Jena
- How can we obtain tractability of multivariate problems?
- 11:00-11:30 Coffee Break
- 11:30-12:00 Gitta Kutyniok, University of Osnabrück
- Geometric Separation by Single-Pass Alternating Thresholding
- 12:00-12:30 Lawrence Carin, Duke University
- Statistical nonlinear matrix completion
- 12:30-15:00 Lunch Break
- 15:00-15:30 Wolfgang Hackbusch, MPI Leipzig
- 1D data compression by tensor based methods
- 15:30-16:00 Rob Stevenson, KdVI Amsterdam
- Space-Time Adaptive Wavelet Methods for Parabolic Evolution Problems
- 16:00-16:30 Angela Kunoth, University of Paderborn
- Space-Time Adaptive Wavelet Methods for Control Problems Constrained by Parabolic PDEs
- 16:30-17:00 Coffee Break
- 17:00-17:30 Stephan Dahlke, University of Marburg
- Multilevel Preconditioning and Adaptive Sparse Solution of Inverse Problems
- 17:30-18:00 Gerd Teschke, Hochschule Neubrandenburg
- Sparse recovery and compressed sensing in inverse problems
- 18:15 Reception
Tuesday, June 8, 2010
- 10:00-10:50 Emmanuel Candès, Stanford University
- Robust Principal Component Analysis?
- 10:50-11:30 Coffee Break
- 11:30-12:00 Benjamin Recht, University of Wisconsin
- The Convex Geometry of Inverse Problems
- 12:00-12:30 Rachel Ward, New York University
- Sparse Legendre expansions via l1 minimization
- 12:30-15:00 Lunch Break
- 15:00-15:30 Stefan Kunis, Technical University of Chemnitz & Helmholtz Center Munich
- Sparse and Fast Fourier Transforms in Biomedical Imaging
- 15:30-16:00 Yonina Eldar, Technion Haifa
- Xampling: Analog-to-digital at Sub-Nyquist rates
- 16:00-16:30 Mauro Maggioni, Duke University
- Multiscale Geometric Analysis of Data Sets
- 16:30-17:00 Coffee Break
- 17:00-17:30 Przemyslaw Wojtaszczyk, University of Warsaw
- Approximation of functions of few variables in high dimensions
- 17:30-18:00 Volodya Temlyakov, University of South Carolina
- Orthogonal Super Greedy Algorithm and Applications in Compressed Sensing
Wednesday, June 9, 2010
- 9:00-9:50 Jean-Luc Guermond, Texas A&M University
- Entropy viscosity for nonlinear conservation laws
- 10:00-10:50 Alain Pajor, Université Paris Est - Marne-la-Vallée
- Random polytopes and neighborliness
- 10:50-11:30 Coffee Break
- 11:30-12:00 Nicole Tomczak-Jaegermann, University of Alberta
- On random matrices with independent log-concave columns
- 12:00-12:30 Bojan Popov, Texas A&M University
- Nonlinear Approximation Techniques Using L1
- 12:30-14:00 Lunch Break
- 14:00-14:50 Joel Tropp, California Institute of Technology
- User-Friendly Tail Bounds for Sums of Random Matrices
- 15:15 Excursion
Thursday, June 10, 2010
- 9:00-9:50 Wolfgang Dahmen, RWTH Aachen
- Convergence Rates for Greedy Algorithms in Reduced Basis Methods
- 10:00-10:50 Christoph Schwab, ETH Zurich
- Convergence Rates for Sparse Adaptive Tensor Approximations of parametric and stochastic PDEs
- 10:50-11:30 Coffee Break
- 11:30-12:20 Albert Cohen, Université Pierre et Marie Curie, Paris
- Analysis of the reduced basis method for parametric elliptic PDEs
- 12:20-15:00 Lunch Break
- 15:00-15:30 Mark Iwen, University of Minnesota
- Sparse Fourier Approximation in High Dimensions
- 15:30-16:00 Karin Schnass, RICAM Linz
- Dictionary Identification - Sparse Matrix-Factorisation via l1-Minimisation
- 16:00-16:30 Maryam Fazel, University of Washington
- A nullspace approach to low-rank matrix recovery
- 16:30-17:00 Coffee Break
- 17:00-17:30 Andrea Montanari, Stanford University
- Message passing algorithms, random convex problems, and the risk of the LASSO
- 17:30-18:00 Michael Elad, Technion Haifa
- Topics in Minimum-Mean-Squared-Error (MMSE) Estimation in Sparse Approximation
- 20:30 Dinner at restaurant Im Stiefel (Bonngasse 30)
Friday, June 11, 2010
- 9:00-9:50 Piotr Indyk, MIT
- Sparse Recovery for Earth Mover Distance
- 10:00-10:30 Thomas Blumensath, University of Southampton
- Three Generalisations of Compressed Sensing
- 10:30-11:00 Özgür Yilmaz, University of British Columbia
- Quantization of Compressed Sensing Measurements
- 11:00-11:30 Coffee Break
- 11:30-12:00 Justin Romberg, Georgia Institute of Technology
- Random coding for forward modeling
- 12:00-12:30 Gabriele Steidl, University of Mannheim
- Dithering by Differences of Convex Functions
- 12:30-14:40 Lunch Break
- 14:40-15:30 Michael Griebel, University of Bonn
- Dimension-wise integration of high-dimensional functions with applications to finance
- 15:30-16:15 Coffee Break
- 16:15-17:15 Roman Vershynin, University of Michigan
- Jointly with the Bonn Mathematical Colloquium
- Non-asymptotic theory of random matrices and sparsity
The 2010 workshop on Modern Massive Data Sets took place two weeks ago at Stanford. The site has a list of the talks, but all the links point to a placeholder. Some talks are available, however, so I am featuring only those talks that have an actual presentation attached to them. They can be found here as well.
Tuesday, June 15, 2010. Theme: Large-scale Data and Large-scale Computation
Time | Talk |
8:00 - 10:00 | Breakfast and Registration -- outside Cubberley Auditorium (at the Stanford School of Education, just off the Main Quad) |
9:45 - 10:00 | Welcome and Opening Remarks -- in Cubberley Auditorium |
10:00 - 11:00 | Tutorial: Peter Norvig, Internet-Scale Data Analysis |
11:00 - 11:30 | Ashok Srivastava, Virtual Sensors and Large-Scale Gaussian Processes |
11:30 - 12:00 | John Langford, A method for Parallel Online Learning |
2:00 - 3:00 | Tutorial: John Gilbert, Combinatorial Scientific Computing: Experience and Challenges |
3:00 - 3:30 | Deepak Agarwal, Estimating Rates of Rare Events through Multiple Hierarchies |
3:30 - 4:00 | James Demmel, Minimizing Communication in Linear Algebra |
4:30 - 5:00 | Dmitri Krioukov, Hyperbolic mapping of complex networks |
5:00 - 5:30 | Mehryar Mohri, Matrix approximation for large-scale learning |
5:30 - 6:00 | David Bader, Massive Scale Analytics of Streaming Social Networks |
6:00 - 6:30 | Ely Porat, Fast Pseudo-Random Fingerprints |
Wednesday, June 16, 2010. Theme: Networked Data and Algorithmic Tools
Thursday, June 17, 2010. Theme: Spectral Methods and Sparse Matrix Methods
Time | Talk |
9:00 - 10:00 | Tutorial: Petros Drineas, Randomized Algorithms in Linear Algebra and Large Data Applications |
10:00 - 10:30 | Gunnar Martinsson, Randomized methods for computing the SVD/PCA of very large matrices |
11:00 - 11:30 | Ilse Ipsen, Numerical reliability of randomized algorithms |
11:30 - 12:00 | Patrick Wolfe, Randomized Algorithms and Sampling Schemes for Large Matrices |
12:00 - 12:30 | Alexandre d'Aspremont, Subsampling, Spectral Methods & Semidefinite Programming |
2:30 - 3:00 | Gary Miller, Specialized System Solvers for very large Systems: Theory and Practice |
3:00 - 3:30 | John Wright and Emmanuel Candes, Robust Principal Component Analysis? |
3:30 - 4:00 | Alon Orlitsky, Estimation, Prediction, and Classification over Large Alphabets |
4:30 - 5:00 | Ken Clarkson, Numerical Linear Algebra in the Streaming Model |
5:00 - 5:30 | David Woodruff, Fast Lp Regression in Data Streams |
Credit: Presentation of Justin Romberg.