Noise-resilient group testing: Limitations and constructions. Cheraghchi, M. In Proceedings of the 17th International Symposium on Fundamentals of Computation Theory (FCT), pages 62–73, 2009. Extended version in Discrete Applied Mathematics. arXiv manuscript published in 2008.
We study combinatorial group testing schemes for learning $d$-sparse boolean vectors using highly unreliable disjunctive measurements. We consider an adversarial noise model that only limits the number of false observations, and show that any noise-resilient scheme in this model can only approximately reconstruct the sparse vector. On the positive side, we take this barrier to our advantage and show that approximate reconstruction (within a satisfactory degree of approximation) allows us to break the information theoretic lower bound of $\tilde{\Omega}(d^2 \log n)$ that is known for exact reconstruction of $d$-sparse vectors of length $n$ via non-adaptive measurements, by a multiplicative factor $\tilde{\Omega}(d)$. Specifically, we give simple randomized constructions of non-adaptive measurement schemes, with $m=O(d \log n)$ measurements, that allow efficient reconstruction of $d$-sparse vectors up to $O(d)$ false positives even in the presence of $\delta m$ false positives and $O(m/d)$ false negatives within the measurement outcomes, for any constant $\delta < 1$. We show that, information theoretically, none of these parameters can be substantially improved without dramatically affecting the others. Furthermore, we obtain several explicit constructions, in particular one matching the randomized trade-off but using $m = O(d^{1+o(1)} \log n)$ measurements. We also obtain explicit constructions that allow fast reconstruction in time $\mathrm{poly}(m)$, which would be sublinear in $n$ for sufficiently sparse vectors. The main tool used in our construction is the list-decoding view of randomness condensers and extractors.
An immediate consequence of our result is an adaptive scheme that runs in only two non-adaptive rounds and exactly reconstructs any $d$-sparse vector using a total of $O(d \log n)$ measurements, a task that would be impossible in one round and fairly easy in $O(\log(n/d))$ rounds.
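To give a concrete feel for the randomized non-adaptive schemes the abstract refers to, the sketch below pools items at random (each test includes each item with probability about $1/d$, outcomes are disjunctive, i.e., ORs) and decodes by eliminating every item that appears in a negative test. This is only an illustrative toy of the general setup, not the paper's condenser-based construction, and all names and parameter choices here are hypothetical:

```python
import random

def group_test(n, d, m, defects):
    """Toy non-adaptive group testing with a random pooling design.

    Each of the m tests includes each of the n items independently
    with probability ~1/d; a test's outcome is the OR (disjunction)
    over the defective items it contains (noiseless here; the paper's
    schemes additionally tolerate adversarially flipped outcomes).
    """
    random.seed(1)  # fixed seed for reproducibility of this demo
    design = [[i for i in range(n) if random.random() < 1.0 / d]
              for _ in range(m)]
    outcomes = [any(i in defects for i in pool) for pool in design]
    # Naive decoder: an item remains a candidate unless it occurs
    # in some negative (all-clear) test.
    candidates = set(range(n))
    for pool, positive in zip(design, outcomes):
        if not positive:
            candidates -= set(pool)
    return sorted(candidates)

defects = {4, 17, 42}
estimate = group_test(n=60, d=3, m=300, defects=defects)
```

In the noiseless case the decoder's output is always a superset of the defective set, and with $m$ on the order of $d \log n$ random tests the number of surviving false positives is small with high probability, matching the flavor of the $O(d)$-false-positive guarantee stated above.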
@INPROCEEDINGS{ref:conf:Che09:testing,
author = {Mahdi Cheraghchi},
title = "Noise-resilient group testing: Limitations and
constructions",
booktitle = "Proceedings of the 17th {International Symposium on
Fundamentals of Computation Theory (FCT)}",
year = 2009,
pages = "62--73",
doi = "10.1007/978-3-642-03409-1_7",
url_Link =
"https://link.springer.com/chapter/10.1007/978-3-642-03409-1_7",
keywords = "Group testing, Randomness condensers, Extractors,
List decoding",
note = {Extended version in {Discrete Applied
Mathematics}. arXiv manuscript published in 2008.},
url_Paper = {https://arxiv.org/abs/0811.2609},
abstract = {We study combinatorial group testing schemes for
learning $d$-sparse boolean vectors using highly
unreliable disjunctive measurements. We consider an
adversarial noise model that only limits the number
of false observations, and show that any
noise-resilient scheme in this model can only
approximately reconstruct the sparse vector. On the
positive side, we take this barrier to our advantage
and show that approximate reconstruction (within a
satisfactory degree of approximation) allows us to
break the information theoretic lower bound of
$\tilde{\Omega}(d^2 \log n)$ that is known for exact
reconstruction of $d$-sparse vectors of length $n$
via non-adaptive measurements, by a multiplicative
factor $\tilde{\Omega}(d)$. Specifically, we give
simple randomized constructions of non-adaptive
measurement schemes, with $m=O(d \log n)$
measurements, that allow efficient reconstruction of
$d$-sparse vectors up to $O(d)$ false positives even
in the presence of $\delta m$ false positives and
$O(m/d)$ false negatives within the measurement
outcomes, for \textit{any} constant $\delta < 1$.
We show that, information theoretically, none of
these parameters can be substantially improved
without dramatically affecting the others.
Furthermore, we obtain several explicit
constructions, in particular one matching the
randomized trade-off but using $m = O(d^{1+o(1)}
\log n)$ measurements. We also obtain explicit
constructions that allow fast reconstruction in time
$\mathrm{poly}(m)$, which would be sublinear in $n$
for sufficiently sparse vectors. The main tool used
in our construction is the list-decoding view of
randomness condensers and extractors. An immediate
consequence of our result is an adaptive scheme that
runs in only two non-adaptive \textit{rounds} and
exactly reconstructs any $d$-sparse vector using a
total $O(d \log n)$ measurements, a task that would
be impossible in one round and fairly easy in
$O(\log(n/d))$ rounds. }
}