
Tuesday, May 31, 2016

Review: RandNLA: Randomized Numerical Linear Algebra


If you've been reading Nuit Blanche for a while, you have heard of RandNLA, as there have been about 75 blog entries on the subject. A review presenting the matter to a wider audience was overdue. Petros Drineas and Michael Mahoney, who were among those who started the field, have just written one for the Communications of the ACM (CACM). Here it is:
 
It starts like this:

Matrices are ubiquitous in computer science, statistics, and applied mathematics. An m×n matrix can encode information about m objects (each described by n features), or the behavior of a discretized differential operator on a finite element mesh; an n×n positive-definite matrix can encode the correlations between all pairs of n objects, or the edge-connectivity between all pairs of nodes in a social network; and so on. Motivated largely by technological developments that generate extremely large scientific and Internet datasets, recent years have witnessed exciting developments in the theory and practice of matrix algorithms. Particularly remarkable is the use of randomization—typically assumed to be a property of the input data due to, for example, noise in the data generation mechanisms—as an algorithmic or computational resource for the development of improved algorithms for fundamental matrix problems such as matrix multiplication, least-squares (LS) approximation, low-rank matrix approximation, and Laplacian-based linear equation solvers...
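For readers who want a concrete feel for what "randomization as a computational resource" means in this setting, here is a minimal sketch of one classic RandNLA idea mentioned in the abstract: low-rank matrix approximation via a random Gaussian sketch (a randomized range finder). This is an illustration under my own assumptions, not the specific algorithms or notation from the review; the function name randomized_low_rank and the oversampling parameter p are made up for the example.

```python
import numpy as np

def randomized_low_rank(A, k, p=10, rng=None):
    """Rank-k approximation of A via a Gaussian sketch (randomized range finder).

    Illustrative only: k is the target rank, p a small oversampling parameter.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    # Sketch A against a random Gaussian test matrix with k + p columns.
    Omega = rng.standard_normal((n, k + p))
    Y = A @ Omega
    # Orthonormal basis for the part of A's range captured by the sketch.
    Q, _ = np.linalg.qr(Y)
    # Project A onto that basis and take the best rank-k factorization there.
    B = Q.T @ A
    U_b, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ U_b[:, :k], s[:k], Vt[:k, :]

# Usage: a 2000 x 500 matrix of exact rank 15 is recovered almost exactly
# by a rank-20 randomized approximation.
rng = np.random.default_rng(0)
A = rng.standard_normal((2000, 15)) @ rng.standard_normal((15, 500))
U, s, Vt = randomized_low_rank(A, k=20, rng=rng)
print("relative error:", np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A))
```

The point of the design is that the large matrix A is touched only through a few matrix-matrix products with a thin random test matrix, which is why this kind of randomization pays off on very large datasets.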

 
