## Saturday, December 30, 2017

### Lectures on Randomized Numerical Linear Algebra by Petros Drineas and Michael Mahoney

Here is a welcome addition to the Randomized Numerical Linear Algebra / RandNLA tag.

Lectures on Randomized Numerical Linear Algebra by Petros Drineas, Michael W. Mahoney

This chapter is based on lectures on Randomized Numerical Linear Algebra from the 2016 Park City Mathematics Institute summer school on The Mathematics of Data.
Here is the table of contents:

1 Introduction
2 Linear Algebra
2.1 Basics
2.2 Norms
2.3 Vector norms
2.4 Induced matrix norms
2.5 The Frobenius norm
2.6 The Singular Value Decomposition
2.7 SVD and Fundamental Matrix Spaces
2.8 Matrix Schatten norms
2.9 The Moore-Penrose pseudoinverse
2.10 References
3 Discrete Probability
3.1 Random experiments: basics
3.2 Properties of events
3.3 The union bound
3.4 Disjoint events and independent events
3.5 Conditional probability
3.6 Random variables
3.7 Probability mass function and cumulative distribution function
3.8 Independent random variables
3.9 Expectation of a random variable
3.10 Variance of a random variable
3.11 Markov’s inequality
3.12 The Coupon Collector Problem
3.13 References
4 Randomized Matrix Multiplication
4.1 Analysis of the RANDMATRIXMULTIPLY algorithm
4.2 Analysis of the algorithm for nearly optimal probabilities
4.3 Bounding the two norm
4.4 References
5 RandNLA Approaches for Regression Problems
5.2 The main algorithm and main theorem
5.3 RandNLA algorithms as preconditioners
5.4 The proof of Theorem 47
5.5 The running time of the RANDLEASTSQUARES algorithm
5.6 References
6 A RandNLA Algorithm for Low-rank Matrix Approximation
6.1 The main algorithm and main theorem
6.2 An alternative expression for the error
6.3 A structural inequality
6.4 Completing the proof of Theorem 80
6.4.1 Bounding Expression (104)
6.5 Running time
6.6 References
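
As a taste of the material in Section 4, here is a minimal sketch (mine, not taken from the text) of the column/row sampling idea behind an algorithm like RANDMATRIXMULTIPLY: approximate the product AB by sampling a few rank-one outer products, using the "nearly optimal" probabilities proportional to the product of column and row norms, and rescaling so the estimate is unbiased. The function name `approx_matmul` is my own; assume dense NumPy arrays.

```python
import numpy as np

def approx_matmul(A, B, c, seed=None):
    """Approximate A @ B by sampling c outer products (a sketch of the
    RANDMATRIXMULTIPLY idea; not the authors' code).

    Column k of A and the matching row k of B are sampled with probability
    proportional to ||A[:, k]|| * ||B[k, :]|| (the 'nearly optimal'
    probabilities), and each sampled outer product is rescaled by
    1 / (c * p_k) so the estimator is unbiased.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    # Nearly optimal sampling probabilities.
    p = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    p = p / p.sum()
    # Sample c indices i.i.d. with replacement according to p.
    idx = rng.choice(n, size=c, p=p)
    C = np.zeros((A.shape[0], B.shape[1]))
    for k in idx:
        C += np.outer(A[:, k], B[k, :]) / (c * p[k])
    return C
```

With these probabilities the expected Frobenius-norm error decays like 1/sqrt(c), which is the kind of guarantee the chapter proves for this algorithm.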

Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.