Thursday, August 20, 2015

The LASSO with Non-linear Measurements is Equivalent to One With Linear Measurements


Consider estimating an unknown, but structured, signal $x_0 \in \mathbb{R}^n$ from $m$ measurements $y_i = g_i(a_i^T x_0)$, where the $a_i$'s are the rows of a known measurement matrix $A$, and $g$ is a (potentially unknown) nonlinear and random link function. Such measurement functions could arise in applications where the measurement device has nonlinearities and uncertainties. They could also arise by design; e.g., $g_i(x) = \mathrm{sign}(x + z_i)$ corresponds to noisy 1-bit quantized measurements. Motivated by the classical work of Brillinger, and more recent work of Plan and Vershynin, we estimate $x_0$ by solving the Generalized LASSO for some regularization parameter $\lambda > 0$ and some (typically non-smooth) convex structure-inducing regularizer function. While this approach may appear to naively ignore the nonlinear function $g$, both Brillinger (in the non-constrained case) and Plan and Vershynin have shown that, when the entries of $A$ are iid standard normal, this is a good estimator of $x_0$ up to a constant of proportionality $\mu$, which depends only on $g$. In this work, we considerably strengthen these results by obtaining explicit expressions for the squared error of the \emph{regularized} LASSO that are asymptotically \emph{precise} when $m$ and $n$ grow large. A main result is that the estimation performance of the Generalized LASSO with nonlinear measurements is \emph{asymptotically the same} as that of one whose measurements are linear, $y_i = \mu a_i^T x_0 + \sigma z_i$, with $\mu = \mathbb{E}[\gamma g(\gamma)]$, $\sigma^2 = \mathbb{E}[(g(\gamma) - \mu\gamma)^2]$, and $\gamma$ standard normal. To the best of our knowledge, the derived expressions for the estimation performance are the first known precise results in this context. One interesting consequence of our result is that the optimal quantizer of the measurements that minimizes the estimation error of the LASSO is the celebrated Lloyd-Max quantizer.
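
For reference, one standard way to write the Generalized LASSO referred to in the abstract is sketched below; the non-squared $\ell_2$ data-fidelity term and the normalization are assumptions of this sketch, not details given in the abstract.

```latex
% Generalized LASSO: l2 data fidelity plus a convex, structure-inducing
% regularizer f (e.g., the l1 norm when x_0 is sparse); lambda > 0 is the
% regularization parameter mentioned in the abstract.
\hat{x} \;=\; \arg\min_{x \in \mathbb{R}^n} \; \| y - A x \|_2 \;+\; \lambda\, f(x)
```

The equivalence result then says that, as $m$ and $n$ grow large, the squared estimation error of $\hat{x}$ (measured against $\mu x_0$, the natural target given the proportionality constant $\mu$) is asymptotically the same whether the $y_i$ are the nonlinear measurements $g_i(a_i^T x_0)$ or the linear ones $\mu a_i^T x_0 + \sigma z_i$.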
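As a quick numerical illustration of that equivalence (my own sketch, not code from the paper), the snippet below generates noisy 1-bit measurements, estimates $\mu$ and $\sigma$ by Monte Carlo, and runs an $\ell_1$-regularized LASSO on both the nonlinear and the equivalent linear measurements. scikit-learn's squared-loss Lasso is used as a stand-in for the Generalized LASSO above, and the dimensions, sparsity, noise level, and regularization weight are arbitrary illustrative choices.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative problem sizes and a k-sparse, unit-norm ground truth.
n, m, k = 400, 800, 20
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x0 /= np.linalg.norm(x0)

A = rng.standard_normal((m, n))   # iid standard normal measurement matrix

# Nonlinear link: noisy 1-bit measurements g(x) = sign(x + z).
sigma_z = 0.3
g = lambda x: np.sign(x + sigma_z * rng.standard_normal(x.shape))
y_nonlin = g(A @ x0)

# mu = E[gamma*g(gamma)] and sigma^2 = E[(g(gamma) - mu*gamma)^2],
# estimated by Monte Carlo over a standard normal gamma.
gamma = rng.standard_normal(200_000)
mu = np.mean(gamma * g(gamma))
sigma = np.sqrt(np.mean((g(gamma) - mu * gamma) ** 2))

# Equivalent linear measurements y_i = mu * a_i^T x0 + sigma * z_i.
y_lin = mu * (A @ x0) + sigma * rng.standard_normal(m)

# Squared-loss, l1-regularized LASSO as a stand-in for the Generalized LASSO.
lam = 0.05
est_nonlin = Lasso(alpha=lam, fit_intercept=False).fit(A, y_nonlin).coef_
est_lin = Lasso(alpha=lam, fit_intercept=False).fit(A, y_lin).coef_

# Per the abstract, the two squared errors (against the rescaled target mu*x0)
# should be asymptotically the same.
print("error, nonlinear measurements:", np.linalg.norm(est_nonlin - mu * x0) ** 2)
print("error, linear measurements:   ", np.linalg.norm(est_lin - mu * x0) ** 2)
```

At these finite sizes the two printed errors will only be roughly comparable; the theorem is an asymptotic statement as $m$ and $n$ grow large.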
 
 
 
