Friday, January 15, 2010

CS: Collaborative Spectrum Sensing for CR Networks, Estimation with Random Linear Mixing, Asymptotic Analysis of Node-Based Recovery


I think it is important to always remember that the end goal of some of what we are doing, however theoretical it sometimes is, has clear and present implications for our, sometimes immediate, surroundings. To remind yourself of that, you may want to watch this whole 47-minute video, or simply the amazing excerpt starting at 7 minutes and 42 seconds where Barbara Schulz recounts that "the worst word of it all was the word permanent...". Finally, I note that most of the sensor technology featured in this video is already included in most iPhones.

In other news, Lianlin Li has decided to copy most of the posts featured here on his blog. Thanks Lianlin. One can always come to the blog and use the "Subscribe by E-MAIL to Nuit Blanche" form on the right-hand side of this page to receive new posts directly by e-mail. But to do that, one would need to be able to access the blog in the first place. It's called a Catch-22.

The first abstract, on collaborative spectrum sensing for cognitive radio (CR) networks, reads:

In cognitive radio, spectrum sensing is a key component for detecting spectrum holes (i.e., channels not used by any primary users). Collaborative spectrum sensing among the cognitive radio nodes is expected to improve the ability to check complete spectrum usage states. Unfortunately, due to power limitations and channel fading, the available channel sensing information is far from sufficient to identify the unoccupied channels directly. Aiming at breaking this bottleneck, we apply recent matrix completion techniques to greatly reduce the amount of sensing information needed. We formulate the collaborative sensing problem as a matrix completion subproblem and a joint-sparsity reconstruction subproblem. Numerical simulations validating the effectiveness and robustness of the proposed approach are presented. In particular, in noiseless cases, when the number of primary users is small, exact detection was obtained with no more than 8% of the complete sensing information, while, as the number of primary users increases, a detection rate of 95.55% required merely 16.8% of the information.
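
To get a feel for what the matrix completion step amounts to in practice, here is a minimal sketch, not the authors' code, that fills in a partially observed low-rank "channel x node" power matrix by singular value thresholding (SVT). The matrix sizes, the roughly 50% sampling rate, and the parameters `tau` and `delta` are illustrative assumptions only; the paper's joint-sparsity reconstruction step is not shown.

```python
# Illustrative SVT-based matrix completion for collaborative spectrum sensing.
# Assumptions (not from the paper): a rank-2 "channel x node" power matrix,
# ~50% of entries observed, and hand-picked tau/delta.
import numpy as np

def svt_complete(M_obs, mask, tau, delta=1.2, n_iter=300):
    """Fill in the unobserved entries of M_obs (mask is True where observed)."""
    Y = np.zeros_like(M_obs)
    X = np.zeros_like(M_obs)
    for _ in range(n_iter):
        # Soft-threshold the singular values (nuclear-norm proximal step).
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt
        # Update the dual variable using the residual on observed entries only.
        Y += delta * mask * (M_obs - X)
    return X

# Toy example: 20 channels x 10 cognitive radio nodes, rank-2 power matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 10))
mask = rng.random(M.shape) < 0.5            # entries actually reported by the nodes
M_hat = svt_complete(M * mask, mask, tau=5.0 * np.sqrt(M.size))
print("relative recovery error:", np.linalg.norm(M_hat - M) / np.linalg.norm(M))
```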

The second abstract, on estimation with random linear mixing, reads:

We consider the general problem of estimating a random vector from measurements generated by a linear transform followed by a componentwise probabilistic measurement channel. We call this the linear mixing estimation problem, since the linear transform couples the components of the input and output vectors. The main contribution of this paper is a general algorithm called linearized belief propagation (LBP) that iteratively "decouples" the vector minimum mean-squared error (MMSE) estimation problem into a sequence of componentwise scalar problems. The algorithm is an extension of Guo and Wang's relaxed BP method to the case of general (non-AWGN) output channels. Although standard BP is generally computationally tractable only for sparse mixing matrices, LBP (like relaxed BP) uses Gaussian approximations and can be applied to arbitrary mixing matrices.
We extend the density evolution (DE) analysis of relaxed BP to obtain a simple scalar characterization of the componentwise behavior of LBP in the limit of large, random transforms. The scalar equivalent model provides simple and asymptotically exact formulae for various performance metrics of LBP. We also strengthen earlier results on the convergence of the DE updates. As with relaxed BP, the convergence result in turn shows that LBP is provably asymptotically mean-squared optimal for AWGN channels when a certain fixed point of the DE equations is unique. Applications are presented to compressed sensing and estimation with bounded noise.
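
To illustrate the kind of scalar characterization the abstract refers to, here is a rough sketch, not the paper's code, of a density-evolution style recursion under purely illustrative assumptions: an AWGN output channel with noise variance `sigma2`, an i.i.d. Bernoulli-Gaussian prior with sparsity `rho`, and a large random mixing matrix with measurement ratio `delta` = m/n. The scalar MMSE is estimated by Monte Carlo.

```python
# Sketch of a density-evolution (state-evolution) recursion for the scalar
# equivalent model: tau <- sigma2 + mmse(tau) / delta. All parameter values
# are assumptions chosen for illustration.
import numpy as np

rng = np.random.default_rng(1)

def scalar_mmse(tau, rho=0.1, n_mc=200_000):
    """Monte Carlo estimate of E[(X - E[X | X + W])^2] with W ~ N(0, tau)
    and X an i.i.d. Bernoulli(rho)-Gaussian scalar."""
    b = rng.random(n_mc) < rho
    x = b * rng.standard_normal(n_mc)
    y = x + np.sqrt(tau) * rng.standard_normal(n_mc)
    # Posterior weight of the Gaussian (non-zero) component given y.
    num = rho * np.exp(-y**2 / (2.0 * (1.0 + tau))) / np.sqrt(1.0 + tau)
    den = num + (1.0 - rho) * np.exp(-y**2 / (2.0 * tau)) / np.sqrt(tau)
    x_hat = (num / den) * y / (1.0 + tau)    # conditional-mean (MMSE) estimate
    return np.mean((x - x_hat) ** 2)

def density_evolution(delta=0.3, sigma2=1e-4, rho=0.1, n_iter=30):
    """Track the effective noise variance of the componentwise scalar model."""
    tau = sigma2 + rho / delta               # start from the prior variance
    for t in range(n_iter):
        tau = sigma2 + scalar_mmse(tau, rho) / delta
        print(f"iter {t:2d}: effective noise variance = {tau:.3e}")
    return tau

density_evolution()
```

The fixed point of such a recursion is what parameterizes the componentwise scalar model mentioned in the abstract; when that fixed point is unique, the abstract's optimality statement applies.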

The third abstract, on the asymptotic analysis of node-based recovery algorithms, reads:

In this paper, we propose a general framework for the asymptotic analysis of node-based verification-based algorithms. In our analysis, we let the signal length $n$ tend to infinity. We also let the number of non-zero elements of the signal, $k$, scale linearly with $n$. Using the proposed framework, we study the asymptotic behavior of the recovery algorithms over random sparse matrices (graphs) in the context of compressive sensing. Our analysis shows that there exists a success threshold on the density ratio $k/n$, below which the recovery algorithms are successful, and beyond which they fail. This threshold is a function of both the graph and the recovery algorithm. We also demonstrate that there is good agreement between the asymptotic behavior of the recovery algorithms and finite-length simulations for moderately large values of $n$.
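
Since node-based verification-based decoding may be unfamiliar, here is a minimal sketch of one such decoder in the noiseless setting, using two classic rules: a check node whose residual is zero verifies all of its unresolved neighbors as zero, and a check node with exactly one unresolved neighbor verifies that neighbor to the residual value. The measurement-matrix construction, the column weight, and the problem sizes are illustrative assumptions, not the paper's setup.

```python
# Sketch of a verification-based recovery pass over a sparse binary
# measurement graph (noiseless, continuous-valued non-zeros). Sizes and the
# column weight are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

def make_sparse_matrix(m, n, col_weight=3):
    """Random binary measurement matrix with `col_weight` ones per column."""
    A = np.zeros((m, n))
    for j in range(n):
        A[rng.choice(m, size=col_weight, replace=False), j] = 1.0
    return A

def verification_recover(A, y, tol=1e-9, max_sweeps=100):
    m, n = A.shape
    x_hat = np.zeros(n)
    verified = np.zeros(n, dtype=bool)
    for _ in range(max_sweeps):
        progress = False
        for i in range(m):
            nbrs = np.flatnonzero(A[i])
            unresolved = nbrs[~verified[nbrs]]
            if unresolved.size == 0:
                continue
            residual = y[i] - x_hat[nbrs] @ A[i, nbrs]
            if abs(residual) < tol:        # zero check: all unresolved neighbors are zero
                verified[unresolved] = True
                progress = True
            elif unresolved.size == 1:     # degree-one check: its value is the residual
                x_hat[unresolved[0]] = residual
                verified[unresolved[0]] = True
                progress = True
        if not progress:                   # no node changed state in this sweep
            break
    return x_hat, verified

# Toy run at a (presumably) favorable density ratio: n = 1000, k = 30, m = 250.
n, m, k = 1000, 250, 30
A = make_sparse_matrix(m, n)
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
x_hat, verified = verification_recover(A, A @ x)
print("all nodes verified:", verified.all(), "| max error:", np.abs(x_hat - x).max())
```

The zero-residual rule is only valid because the non-zero values are drawn from a continuous distribution, so accidental cancellations happen with probability zero; this is the usual assumption in the verification-based decoding literature.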
