## Monday, July 15, 2013

### Group Testing: GROTESQUE: Noisy Group Testing (Quick and Efficient), Efficient Probabilistic Group Testing Based on Traitor Tracing, Group testing algorithms: bounds and simulations

Today, our emphasis is on group testing: here is what we could find on the subject on arXiv over the past two months. I sure would love for more of the authors to make implementations of their work available, especially now that it looks like we have gone beyond the asymptotics game.

Group-testing refers to the problem of identifying (with high probability) a (small) subset of $D$ defectives from a (large) set of $N$ items via a "small" number of "pooled" tests. For ease of presentation, this work focuses on the regime where $D = O(N^{1-\delta})$ for some $\delta > 0$. The tests may be noiseless or noisy, and the testing procedure may be adaptive (the pool defining a test may depend on the outcome of a previous test) or non-adaptive (each test is performed independently of the outcomes of other tests). A rich body of literature demonstrates that $\Theta(D\log(N))$ tests are information-theoretically necessary and sufficient for the group-testing problem, and provides algorithms that achieve this performance. However, it is only recently that reconstruction algorithms with computational complexity that is sub-linear in $N$ have started being investigated (recent work in [GurI:04, IndN:10, NgoP:11] gave some of the first such algorithms). In the scenario of adaptive tests with noisy outcomes, we present the first scheme that is simultaneously order-optimal (up to small constant factors) in both the number of tests and the decoding complexity ($O(D\log(N))$ in both performance metrics). The total number of stages of our adaptive algorithm is "small" ($O(\log(D))$). Similarly, in the scenario of non-adaptive tests with noisy outcomes, we present the first scheme that is simultaneously near-optimal in both the number of tests and the decoding complexity (via an algorithm that requires $O(D\log(D)\log(N))$ tests and has a decoding complexity of $O(D(\log N+\log^{2}D))$). Finally, we present an adaptive algorithm that only requires 2 stages, and for which both the number of tests and the decoding complexity scale as $O(D(\log N+\log^{2}D))$. For all three settings, the probability of error of our algorithms scales as $O(1/\mathrm{poly}(D))$.
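To make the pooled-testing setup these papers work in concrete (this is a toy illustration, not the GROTESQUE scheme itself), here is a minimal noiseless non-adaptive simulation with a Bernoulli test matrix, decoded by the classical COMP rule: any item that appears in a negative pool is ruled out.

```python
import random

def simulate_comp(n_items, defectives, n_tests, p=0.1, seed=0):
    """Noiseless non-adaptive group testing with a Bernoulli(p) test design,
    decoded by COMP: any item in a negative pool cannot be defective."""
    rng = random.Random(seed)
    # Each test pools a random subset of items (Bernoulli design).
    pools = [{i for i in range(n_items) if rng.random() < p}
             for _ in range(n_tests)]
    # A test is positive iff its pool contains at least one defective.
    outcomes = [bool(pool & defectives) for pool in pools]
    # COMP: start from all items, remove anything seen in a negative test.
    candidates = set(range(n_items))
    for pool, positive in zip(pools, outcomes):
        if not positive:
            candidates -= pool
    return candidates  # always a superset of the true defective set

defectives = {3, 42, 77}
estimate = simulate_comp(n_items=100, defectives=defectives, n_tests=60)
print(defectives <= estimate)  # True: COMP never misses a defective
```

In the noiseless setting COMP can only err by over-reporting, which is why the number of tests needed to shrink the candidate set down to the true defectives is the interesting quantity.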

Inspired by recent results from collusion-resistant traitor tracing, we provide a framework for constructing efficient probabilistic group testing schemes. In the traditional group testing model, our scheme asymptotically requires $T \sim 2K \ln N$ tests to find (with high probability) the correct set of $K$ defectives out of $N$ items. Several other models are also considered, such as some noisy group testing and threshold group testing models, but we emphasize that this framework can be applied to many other variants of the classical model as well, both in adaptive and in non-adaptive settings.
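To get a feel for the T ~ 2 K ln N scaling claimed in this abstract, a quick back-of-the-envelope computation (the constant is asymptotic, so these figures are only indicative):

```python
import math

def asymptotic_tests(K, N):
    """Asymptotic test count T ~ 2 K ln N quoted in the abstract above."""
    return 2 * K * math.log(N)

# K = 10 defectives among N items: the test count grows only logarithmically in N.
for N in (10**4, 10**6, 10**8):
    print(N, round(asymptotic_tests(10, N)))  # 184, 276, 368
```

Even at a hundred million items, a few hundred pooled tests suffice in this regime, which is what makes these schemes attractive in the first place.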

We consider the problem of non-adaptive noiseless group testing. We develop several new algorithms and analyse their probability of success. In particular, we describe the Definite Defectives (DD) algorithm, and prove bounds on its probability of success, comparing it to the best known algorithms in the literature. Further, we describe the SCOMP algorithm, and perform simulations which show that it performs even better than the DD algorithm. Finally, we derive upper bounds on the success probability for any algorithm based on a Bernoulli sampled test matrix. We analyse the normalized number of tests required, in terms of the information-theoretic capacity. In certain regimes we argue that in an asymptotic sense the DD algorithm is optimal, and that less sparse problems (more defectives) have a larger `adaptivity gap'.
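A sketch of the DD decoder, following the description in the abstract (the parameters in the toy run are my own illustrative choices, not from the paper): first rule out items seen in negative tests, then declare defective any remaining candidate that is the sole surviving member of some positive pool.

```python
import random

def dd_decode(pools, outcomes, n_items):
    """Sketch of the Definite Defectives (DD) decoder for noiseless
    non-adaptive group testing.
    pools: list of sets of pooled item indices; outcomes: list of bools."""
    # Step 1 (as in COMP): items in any negative test cannot be defective.
    possible = set(range(n_items))
    for pool, positive in zip(pools, outcomes):
        if not positive:
            possible -= pool
    # Step 2: a positive pool whose only surviving candidate is a single
    # item pins that item down as a definite defective.
    definite = set()
    for pool, positive in zip(pools, outcomes):
        remaining = pool & possible
        if positive and len(remaining) == 1:
            definite |= remaining
    return definite  # a subset of the true defectives: no false positives

# Toy run with a Bernoulli test design (illustrative parameters only).
rng = random.Random(1)
n, defectives = 50, {5, 20}
pools = [{i for i in range(n) if rng.random() < 0.1} for _ in range(40)]
outcomes = [bool(pool & defectives) for pool in pools]
print(dd_decode(pools, outcomes, n) <= defectives)  # True: DD never over-reports
```

DD thus errs in the opposite direction from COMP: it can miss defectives but never falsely accuses an item, which is what the paper's success-probability bounds quantify.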
Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.