@INPROCEEDINGS{ref:conf:Che09:testing,
  author =	 {Mahdi Cheraghchi},
  title =	 "Noise-resilient group testing: Limitations and
                  constructions",
  booktitle =	 "Proceedings of the 17th {International Symposium on
                  Fundamentals of Computation Theory (FCT)}",
  year =	 2009,
  pages =	 "62--73",
  doi =		 "10.1007/978-3-642-03409-1_7",
  url_Link =
                  "https://link.springer.com/chapter/10.1007/978-3-642-03409-1_7",
  keywords =	 "Group testing, Randomness condensers, Extractors,
                  List decoding",
  note =	 {Extended version in {Discrete Applied
                  Mathematics}. arXiv manuscript published in 2008.},
  url_Paper =	 {https://arxiv.org/abs/0811.2609},
  abstract =	 {We study combinatorial group testing schemes for
                  learning $d$-sparse boolean vectors using highly
                  unreliable disjunctive measurements. We consider an
                  adversarial noise model that only limits the number
                  of false observations, and show that any
                  noise-resilient scheme in this model can only
                  approximately reconstruct the sparse vector. On the
                  positive side, we take this barrier to our advantage
                  and show that approximate reconstruction (within a
                  satisfactory degree of approximation) allows us to
                  break the information theoretic lower bound of
                  $\tilde{\Omega}(d^2 \log n)$ that is known for exact
                  reconstruction of $d$-sparse vectors of length $n$
                  via non-adaptive measurements, by a multiplicative
                  factor $\tilde{\Omega}(d)$.  Specifically, we give
                  simple randomized constructions of non-adaptive
                  measurement schemes, with $m=O(d \log n)$
                  measurements, that allow efficient reconstruction of
                  $d$-sparse vectors up to $O(d)$ false positives even
                  in the presence of $\delta m$ false positives and
                  $O(m/d)$ false negatives within the measurement
                  outcomes, for \textit{any} constant $\delta < 1$.
                  We show that, information theoretically, none of
                  these parameters can be substantially improved
                  without dramatically affecting the others.
                  Furthermore, we obtain several explicit
                  constructions, in particular one matching the
                  randomized trade-off but using $m = O(d^{1+o(1)}
                  \log n)$ measurements. We also obtain explicit
                  constructions that allow fast reconstruction in time
                  $\mathrm{poly}(m)$, which would be sublinear in $n$
                  for sufficiently sparse vectors. The main tool used
                  in our construction is the list-decoding view of
                  randomness condensers and extractors.  An immediate
                  consequence of our result is an adaptive scheme that
                  runs in only two non-adaptive \textit{rounds} and
                  exactly reconstructs any $d$-sparse vector using a
                  total of $O(d \log n)$ measurements, a task that would
                  be impossible in one round and fairly easy in
                  $O(\log(n/d))$ rounds.  }
}