Noise-resilient group testing: Limitations and constructions. Cheraghchi, M. Discrete Applied Mathematics, 161(1):81–95, 2013. Preliminary version in Proceedings of FCT 2009. arXiv manuscript published in 2008.
We study combinatorial group testing schemes for learning $d$-sparse boolean vectors using highly unreliable disjunctive measurements. We consider an adversarial noise model that only limits the number of false observations, and show that any noise-resilient scheme in this model can only approximately reconstruct the sparse vector. On the positive side, we take this barrier to our advantage and show that approximate reconstruction (within a satisfactory degree of approximation) allows us to break the information theoretic lower bound of $\tilde{\Omega}(d^2 \log n)$ that is known for exact reconstruction of $d$-sparse vectors of length $n$ via non-adaptive measurements, by a multiplicative factor $\tilde{\Omega}(d)$. Specifically, we give simple randomized constructions of non-adaptive measurement schemes, with $m=O(d \log n)$ measurements, that allow efficient reconstruction of $d$-sparse vectors up to $O(d)$ false positives even in the presence of $\delta m$ false positives and $O(m/d)$ false negatives within the measurement outcomes, for any constant $\delta < 1$. We show that, information theoretically, none of these parameters can be substantially improved without dramatically affecting the others. Furthermore, we obtain several explicit constructions, in particular one matching the randomized trade-off but using $m = O(d^{1+o(1)} \log n)$ measurements. We also obtain explicit constructions that allow fast reconstruction in time $\mathrm{poly}(m)$, which would be sublinear in $n$ for sufficiently sparse vectors. The main tool used in our construction is the list-decoding view of randomness condensers and extractors. An immediate consequence of our result is an adaptive scheme that runs in only two non-adaptive rounds and exactly reconstructs any $d$-sparse vector using a total of $O(d \log n)$ measurements, a task that would be impossible in one round and fairly easy in $O(\log(n/d))$ rounds.
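To make the measurement model concrete, the following is a minimal Python simulation sketch. It uses a random Bernoulli design with a simple threshold ("noisy COMP"-style) decoder, not the paper's condenser-based constructions, and random rather than adversarial noise flips; all constants below (the $1/d$ test-inclusion probability, a noise level of $\delta = 0.05$, the false-negative budget, the decoding threshold) are illustrative assumptions, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)

n, d = 10_000, 10                    # universe size, sparsity
m = 20 * d * int(np.log2(n))         # O(d log n) tests; the constant is illustrative

# Random design: each item joins each test independently with probability 1/d,
# so a test avoids all d defectives (i.e., reads negative) w.p. (1 - 1/d)^d ~ 1/e.
A = rng.random((m, n)) < 1.0 / d

# Hidden d-sparse boolean vector x.
x = np.zeros(n, dtype=bool)
x[rng.choice(n, size=d, replace=False)] = True

# Disjunctive (OR) measurements: a test fires iff it contains a defective.
y = (A & x).any(axis=1)

# Noise within the tolerances stated in the abstract: flip some negative
# outcomes to positive (delta = 0.05 here, i.e. m/20 false positives) and a
# few positive outcomes to negative (within the O(m/d) false-negative budget).
y_noisy = y.copy()
y_noisy[rng.choice(np.flatnonzero(~y), size=m // 20, replace=False)] = True
y_noisy[rng.choice(np.flatnonzero(y), size=m // (8 * d), replace=False)] = False

# Threshold decoding: count, for each item, the observed-negative tests that
# contain it. A defective is only "voted against" by flipped outcomes, so its
# count never exceeds the m/(8d) false-negative budget; a threshold above that
# budget keeps every defective while screening out most non-defectives.
neg_votes = A[~y_noisy].sum(axis=0)
x_hat = neg_votes <= m // (5 * d)

print("defectives kept:", int(x[x_hat].sum()), "of", d)
print("false positives in estimate:", int((~x & x_hat).sum()))

# Two-round exact reconstruction, as in the abstract's closing remark: round 1
# leaves d + O(d) candidates; round 2 tests each candidate individually
# (assumed noiseless here for simplicity), for O(d log n) tests in total.
candidates = np.flatnonzero(x_hat)
x_exact = np.zeros(n, dtype=bool)
x_exact[candidates] = x[candidates]  # one singleton test per candidate
print("exact after two rounds:", bool(np.array_equal(x_exact, x)))

The threshold choice is the point of the sketch: a defective accumulates at most one negative vote per flipped outcome, while a typical non-defective collects negative votes from roughly a $(1-1/d)^d \approx 1/e$ fraction of the $\sim m/d$ tests containing it, so the two populations separate and only $O(d)$ non-defectives survive.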
@ARTICLE{ref:Che13:testing,
  author =	 {Mahdi Cheraghchi},
  title =	 {Noise-resilient group testing: Limitations and
                  constructions},
  journal =	 {Discrete Applied Mathematics},
  volume =	 {161},
  number =	 {1},
  pages =	 {81--95},
  year =	 {2013},
  doi =		 {10.1016/j.dam.2012.07.022},
  url_Link =	 {http://www.sciencedirect.com/science/article/pii/S0166218X12003009},
  keywords =	 {Group testing, Randomness condensers, Extractors,
                  List decoding},
  note =	 {Preliminary version in Proceedings of {FCT} 2009. arXiv
                  manuscript published in 2008.},
  url_Paper =	 {https://arxiv.org/abs/0811.2609},
  abstract =	 {We study combinatorial group testing schemes for
                  learning $d$-sparse boolean vectors using highly
                  unreliable disjunctive measurements. We consider an
                  adversarial noise model that only limits the number
                  of false observations, and show that any
                  noise-resilient scheme in this model can only
                  approximately reconstruct the sparse vector. On the
                  positive side, we take this barrier to our advantage
                  and show that approximate reconstruction (within a
                  satisfactory degree of approximation) allows us to
                  break the information theoretic lower bound of
                  $\tilde{\Omega}(d^2 \log n)$ that is known for exact
                  reconstruction of $d$-sparse vectors of length $n$
                  via non-adaptive measurements, by a multiplicative
                  factor $\tilde{\Omega}(d)$.  Specifically, we give
                  simple randomized constructions of non-adaptive
                  measurement schemes, with $m=O(d \log n)$
                  measurements, that allow efficient reconstruction of
                  $d$-sparse vectors up to $O(d)$ false positives even
                  in the presence of $\delta m$ false positives and
                  $O(m/d)$ false negatives within the measurement
                  outcomes, for \textit{any} constant $\delta < 1$.
                  We show that, information theoretically, none of
                  these parameters can be substantially improved
                  without dramatically affecting the others.
                  Furthermore, we obtain several explicit
                  constructions, in particular one matching the
                  randomized trade-off but using $m = O(d^{1+o(1)}
                  \log n)$ measurements. We also obtain explicit
                  constructions that allow fast reconstruction in time
                  $\mathrm{poly}(m)$, which would be sublinear in $n$
                  for sufficiently sparse vectors. The main tool used
                  in our construction is the list-decoding view of
                  randomness condensers and extractors.  An immediate
                  consequence of our result is an adaptive scheme that
                  runs in only two non-adaptive \textit{rounds} and
                  exactly reconstructs any $d$-sparse vector using a
total of $O(d \log n)$ measurements, a task that would
                  be impossible in one round and fairly easy in
                  $O(\log(n/d))$ rounds.  }
}
