Approximate Sparse Linear Regression by Sariel Har-Peled, Piotr Indyk, Sepideh Mahabadi

In the Sparse Linear Regression (SLR) problem, given a d×n matrix M and a d-dimensional vector q, we want to compute a k-sparse vector τ such that the error ||Mτ − q|| is minimized. In this paper, we present algorithms and conditional lower bounds for several variants of this problem. In particular, we consider (i) the Affine SLR, where we add the constraint that ∑_i τ_i = 1, and (ii) the Convex SLR, where we further add the constraint that τ ≥ 0. Furthermore, we consider (i) the batched (offline) setting, where the matrix M and the vector q are given as inputs in advance, and (ii) the query (online) setting, where an algorithm preprocesses the matrix M to quickly answer such queries. All of the aforementioned variants have been well-studied and have many applications in statistics, machine learning and sparse recovery.
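To make the variants concrete: once a k-element support is fixed, the Affine SLR objective reduces to least squares under the single equality constraint ∑_i τ_i = 1, which can be solved directly via its KKT system. The sketch below is illustrative only (it is not the paper's algorithm) and assumes the d×k submatrix A has full column rank; the function name `affine_ls` is our own.

```python
import numpy as np

def affine_ls(A, q):
    """Minimize ||A·tau - q|| subject to sum(tau) = 1.

    A sketch via the KKT system of the equality-constrained least-squares
    problem; assumes A (d x k) has full column rank so the system is solvable.
    """
    k = A.shape[1]
    ones = np.ones(k)
    # KKT system: [[A^T A, 1], [1^T, 0]] [tau; lam] = [A^T q; 1]
    kkt = np.block([[A.T @ A, ones[:, None]],
                    [ones[None, :], np.zeros((1, 1))]])
    rhs = np.concatenate([A.T @ q, [1.0]])
    sol = np.linalg.solve(kkt, rhs)
    return sol[:k]  # drop the Lagrange multiplier
```

The Convex SLR variant additionally requires τ ≥ 0, which turns the per-support subproblem into a small quadratic program rather than a linear solve.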

We consider the approximate variants of these problems in the "low sparsity regime," where the value of the sparsity bound k is low. In particular, we show that the online variant of all three problems can be solved with query time Õ(n^(k−1)). This provides a non-trivial improvement over the naive algorithm that exhaustively searches all (n choose k) subsets B. We also show that solving the offline variant of all three problems would require an exponential dependence of the form Ω̃(n^(k/2)/e^k), under a natural complexity-theoretic conjecture. Improving this lower bound for the case of k = 4 would imply a nontrivial lower bound for the famous Hopcroft's problem. Moreover, solving the offline variant of Affine SLR in o(n^(k−1)) time would imply an upper bound of o(n^d) for the problem of testing whether a given set of n points in d-dimensional space is degenerate. However, this is conjectured to require Ω(n^d) time.
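The naive baseline referenced above can be sketched in a few lines: enumerate all (n choose k) supports and solve an unconstrained least-squares problem on each, keeping the best. This is a minimal illustration of the exhaustive search the paper improves on, not the authors' algorithm; the function name `naive_slr` is our own.

```python
import itertools
import numpy as np

def naive_slr(M, q, k):
    """Exact sparse linear regression by brute force.

    Tries every (n choose k) support set and solves least squares on the
    corresponding d x k submatrix; cost grows as C(n, k) * poly(d, k).
    Returns the best k-sparse tau and its error ||M tau - q||.
    """
    d, n = M.shape
    best_err, best_tau = np.inf, None
    for support in itertools.combinations(range(n), k):
        cols = M[:, support]                              # d x k submatrix
        coef, *_ = np.linalg.lstsq(cols, q, rcond=None)   # unconstrained LS
        err = np.linalg.norm(cols @ coef - q)
        if err < best_err:
            tau = np.zeros(n)
            tau[list(support)] = coef
            best_err, best_tau = err, tau
    return best_tau, best_err
```

Even for modest k the C(n, k) factor dominates, which is why the Õ(n^(k−1)) query-time bound in the low-sparsity regime is a meaningful improvement.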

**Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!**

Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
