
Thursday, August 30, 2012

Implementation: BLENDENPIK: Supercharging LAPACK's Least-Squares Solver



Is this what you are looking for: http://pdos.csail.mit.edu/~petar/papers/blendenpik-v1.pdf
This is random preconditioning, which works well in practice.

I meant to follow up on it at the time but did not. Here is the paper: BLENDENPIK: Supercharging LAPACK's Least-Squares Solver by Haim Avron, Petar Maymounkov and Sivan Toledo. The abstract reads:
Abstract. Several innovative random-sampling and random-mixing techniques for solving problems in linear algebra have been proposed in the last decade, but they have not yet made a significant impact on numerical linear algebra. We show that by using a high-quality implementation of one of these techniques we obtain a solver that performs extremely well in the traditional yardsticks of numerical linear algebra: it is significantly faster than high-performance implementations of existing state-of-the-art algorithms, and it is numerically backward stable. More specifically, we describe a least-squares solver for dense highly overdetermined systems that achieves residuals similar to those of direct QR factorization based solvers (LAPACK), outperforms LAPACK by large factors, and scales significantly better than any QR-based solver.



The attendant code is here. If you recall, this is in line with the subject of Streaming Data Mining.
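For readers who want a feel for what the paper does, here is a minimal NumPy/SciPy sketch of the general randomized-preconditioning recipe it builds on: mix the rows with random signs and a fast trigonometric transform, sample a small number of mixed rows uniformly, QR-factor the sample to get a preconditioner R, and run LSQR on A R^{-1}. This is not the authors' implementation; the function name, the choice of the DCT as the mixing transform, and the sampling factor gamma are illustrative assumptions, and a real solver would also need to handle the case where the sampled block turns out to be rank deficient or ill conditioned.

```python
import numpy as np
from scipy.fft import dct
from scipy.linalg import qr, solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr

def randomized_precond_lstsq(A, b, gamma=4.0, seed=0):
    """Solve min ||Ax - b||_2 for a dense, highly overdetermined A (m >> n)
    using randomized-mixing preconditioning followed by LSQR (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape

    # 1) Random row mixing: random sign flips followed by an orthonormal DCT,
    #    which spreads the row norms so uniform sampling is informative.
    signs = rng.choice([-1.0, 1.0], size=m)
    A_mixed = dct(signs[:, None] * A, axis=0, norm='ortho')

    # 2) Uniformly sample about gamma * n rows of the mixed matrix.
    s = min(m, int(np.ceil(gamma * n)))
    rows = rng.choice(m, size=s, replace=False)

    # 3) Economy QR of the small sampled block; R is the preconditioner.
    R = qr(A_mixed[rows, :], mode='economic')[1]

    # 4) LSQR on the well-conditioned operator A R^{-1}, then undo the
    #    change of variables: x = R^{-1} y.
    ARinv = LinearOperator(
        (m, n),
        matvec=lambda v: A @ solve_triangular(R, v, lower=False),
        rmatvec=lambda v: solve_triangular(R.T, A.T @ v, lower=True),
    )
    y = lsqr(ARinv, b, atol=1e-14, btol=1e-14)[0]
    return solve_triangular(R, y, lower=False)

# Quick check on a random tall problem.
m, n = 20000, 300
A = np.random.randn(m, n)
x_true = np.random.randn(n)
b = A @ x_true + 1e-8 * np.random.randn(m)
x = randomized_precond_lstsq(A, b)
print(np.linalg.norm(A @ x - b), np.linalg.norm(x - x_true))
```

The point of the mixing step is that after it, any small uniform sample of rows captures the geometry of the whole matrix, so the preconditioned system is well conditioned and LSQR needs only a modest number of iterations regardless of how badly conditioned A itself is.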


Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
