Restricted Isometry of Fourier Matrices and List Decodability of Random Linear Codes. Cheraghchi, M., Guruswami, V., & Velingker, A. In Proceedings of the 24th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pages 432–442, 2013. Extended version in SIAM Journal on Computing (SICOMP).
Abstract: We prove that a random linear code over $\mathbb{F}_q$, with probability arbitrarily close to $1$, is list decodable at radius $1-1/q-\epsilon$ with list size $L=O(1/\epsilon^2)$ and rate $R=\Omega_q(\epsilon^2/(\log^3(1/\epsilon)))$. Up to the polylogarithmic factor in $1/\epsilon$ and constant factors depending on $q$, this matches the lower bound $L=\Omega_q(1/\epsilon^2)$ for the list size and upper bound $R=O_q(\epsilon^2)$ for the rate. Previously only existence (and not abundance) of such codes was known for the special case $q=2$ (Guruswami, Håstad, Sudan and Zuckerman, 2002). In order to obtain our result, we employ a relaxed version of the well known Johnson bound on list decoding that translates the average Hamming distance between codewords to list decoding guarantees. We furthermore prove that the desired average-distance guarantees hold for a code provided that a natural complex matrix encoding the codewords satisfies the Restricted Isometry Property with respect to the Euclidean norm (RIP-2). For the case of random binary linear codes, this matrix coincides with a random submatrix of the Hadamard-Walsh transform matrix that is well studied in the compressed sensing literature. Finally, we improve the analysis of Rudelson and Vershynin (2008) on the number of random frequency samples required for exact reconstruction of $k$-sparse signals of length $N$. Specifically, we improve the number of samples from $O(k \log(N) \log^2(k) (\log k + \log\log N))$ to $O(k \log(N) \cdot \log^3(k))$. The proof involves bounding the expected supremum of a related Gaussian process by using an improved analysis of the metric defined by the process. This improvement is crucial for our application in list decoding.
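The connection between random binary linear codes and the Hadamard-Walsh matrix mentioned in the abstract can be checked concretely: if $G$ is a random $k \times n$ generator matrix over $\mathbb{F}_2$, the $n \times 2^k$ matrix with entries $(-1)^{(xG)_i}$ (one column per message $x$) consists exactly of the rows of the $2^k \times 2^k$ Hadamard-Walsh matrix indexed by the columns of $G$. The NumPy sketch below verifies this identity on a toy instance; the variable names and parameter choices are illustrative and not taken from the paper.

    # Illustrative sketch: the signed codeword matrix of a random binary linear
    # [n, k] code is a row-submatrix of the 2^k x 2^k Hadamard-Walsh matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    k, n = 4, 6                                  # toy message length and block length
    G = rng.integers(0, 2, size=(k, n))          # random generator matrix over F_2

    # Signed codeword matrix: entry (i, x) = (-1)^{(xG)_i} over all 2^k messages x.
    messages = (np.arange(2**k)[:, None] >> np.arange(k)) & 1   # messages as bit rows (LSB first)
    codewords = messages @ G % 2                                 # shape (2^k, n)
    code_matrix = (-1.0) ** codewords.T                          # shape (n, 2^k)

    # Hadamard-Walsh matrix of order 2^k: H[y, x] = (-1)^{<bits(y), bits(x)>}.
    bits = (np.arange(2**k)[:, None] >> np.arange(k)) & 1
    H = (-1.0) ** (bits @ bits.T % 2)

    # Each column of G, read as an integer, indexes the corresponding row of H.
    row_idx = (G * (2 ** np.arange(k))[:, None]).sum(axis=0)
    assert np.array_equal(code_matrix, H[row_idx])

Since the columns of $G$ are uniformly random, the selected rows form a uniformly random row-submatrix of the Hadamard-Walsh matrix, which is the object whose RIP-2 behavior is studied in the compressed sensing literature.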
@INPROCEEDINGS{ref:conf:CGV13,
author = {Cheraghchi, Mahdi and Guruswami, Venkatesan and
Velingker, Ameya},
title = {Restricted Isometry of {Fourier} Matrices and List
Decodability of Random Linear Codes},
year = 2013,
booktitle = {Proceedings of the 24th {Annual ACM-SIAM Symposium
on Discrete Algorithms (SODA)}},
pages = {432--442},
numpages = 11,
keywords = {combinatorial list decoding, compressed sensing,
{Gaussian} processes},
url_Link = {https://dl.acm.org/doi/10.5555/2627817.2627848},
abstract = {We prove that a random linear code over
$\mathbb{F}_q$, with probability arbitrarily close
to $1$, is list decodable at radius $1-1/q-\epsilon$
with list size $L=O(1/\epsilon^2)$ and rate
$R=\Omega_q(\epsilon^2/(\log^3(1/\epsilon)))$. Up to
the polylogarithmic factor in $1/\epsilon$ and
constant factors depending on $q$, this matches the
lower bound $L=\Omega_q(1/\epsilon^2)$ for the list
size and upper bound $R=O_q(\epsilon^2)$ for the
rate. Previously only existence (and not abundance)
of such codes was known for the special case $q=2$
(Guruswami, H{\aa}stad, Sudan and Zuckerman, 2002).
In order to obtain our result, we employ a relaxed
version of the well known Johnson bound on list
decoding that translates the \textit{average}
Hamming distance between codewords to list decoding
guarantees. We furthermore prove that the desired
average-distance guarantees hold for a code provided
that a natural complex matrix encoding the codewords
satisfies the Restricted Isometry Property with
respect to the Euclidean norm (RIP-2). For the case
of random binary linear codes, this matrix coincides
with a random submatrix of the Hadamard-Walsh
transform matrix that is well studied in the
compressed sensing literature. Finally, we improve
the analysis of Rudelson and Vershynin (2008) on the
number of random frequency samples required for
exact reconstruction of $k$-sparse signals of length
$N$. Specifically, we improve the number of samples
from $O(k \log(N) \log^2(k) (\log k + \log\log N))$
to $O(k \log(N) \cdot \log^3(k))$. The proof
involves bounding the expected supremum of a related
Gaussian process by using an improved analysis of
the metric defined by the process. This improvement
is crucial for our application in list decoding.},
note = {Extended version in {SIAM Journal on Computing
(SICOMP).}},
url_Paper = {https://eccc.weizmann.ac.il//report/2012/082}
}
{"_id":"Fon3utZTSCoCi3Lw7","bibbaseid":"cheraghchi-guruswami-velingker-restrictedisometryoffouriermatricesandlistdecodabilityofrandomlinearcodes-2013","authorIDs":["2n8MNophuzbeevTa8","3NEcSaujokmJYSDaa","3tFWxWs2qWeYAZx9a","4QNcMTdRiWr2gs8Sk","5KoQWR3vSjnsoZNz5","5i4QHRc5LGio8Mf5u","62bYDgAFwCxaQ4Q9T","685mTysGDdQJKGxEE","6sX76eTffL7p76peN","8NLx3B3FAvaK54hSK","9NZpjMJLG7dNWroGm","9aD4MPX9ELhsyJmaR","9aFgrqcc4j28kZn8n","A9wAgP7TPK9tw28qY","BJ6h7zrsT3L89RKSg","BWL9E9QxvrST7y7ym","Cht4qGZ9eYAvPygNC","D3NMRJpac7Z2oFz7x","EiL6Xv4GTWGB97B8H","F3Y934eNyTeEJsg6E","FDEj5Zwdm28pFcAnB","FJdyLy2TL3v973ge8","GxccwstJJuJ4rg7Dq","H4D7r27RcPALT5DCs","HP7szFXWBWFXXZhdA","HRX7xsd7ZkTNvr67D","Hj3KN5PTNMST8hD3b","JEvEPvDBYNNXgGBnp","JYpde2ppjXLva6cre","KFgC2dZG7jXYAgZ3T","NRg9mmaSB55QqzNnH","NWCEkq6XqRBCiGmMe","NpGaG45evixRFDMiF","NyDiXeBc7cuxdWrqh","P6pva6vpPZCz6ndh9","Py2jfYGNZKNt7nxL6","Q6E9aDkYPcbhngLMx","QYrXKExv3BPABZGyA","QupQWsidagmv2nu8Z","SGZ2YignSm7njeTxy","SSuyWxzudqBDgAosw","THz3CmRmH3zZ9Xfud","TTEBJzPHwrY4d2Qfi","Wzr7kB4bxMDqceidA","YedfCw6zcDLoWAWFL","YtTEuSL9GJ8pkKcZw","Z3w2d32WjDczZMeGo","aduB2YE7dcNtbHnAN","c8gPvTXFPd9NazgEw","d6HAadRZAtz97Y2so","dTBDNYCcYKNNdhqaR","ezDt3Lb3Q6Sbo2rfX","fXtxgjbjZswBmF45i","ftBpmnKRHoB2muB8u","gKxHau44e8gnmxs6v","hM29eSWZbASnmDdFf","hw7Q4GHDAHkLTAyeB","i6Ns5rSW8R3ifxeHg","jJcoL4QWRkJQ59LfW","kKvRZ55rH7sfbubS2","kdfqsAMqCFDhpuW3S","koPTGcsAkwhGbkAYe","manxWg6Q3ZC5vW4JE","pwN2yYKo5DdSDaZGs","qpSgMrJ8WQNupjbXX","sD5Wq95oeSzqGF9kn","uSGLWGoXjyDyozeEy","wCcpScxkvg5RkcmWm","xKz7kx4eXbnkHeNXP","xeiij9YsbXBbMjciP","yGxZz3yuu6krMRxgK","yjJrpKY5QmDe8SXvm","zaR6PwJ7aC9xWBpiy"],"author_short":["Cheraghchi, M.","Guruswami, V.","Velingker, A."],"bibdata":{"bibtype":"inproceedings","type":"inproceedings","author":[{"propositions":[],"lastnames":["Cheraghchi"],"firstnames":["Mahdi"],"suffixes":[]},{"propositions":[],"lastnames":["Guruswami"],"firstnames":["Venkatesan"],"suffixes":[]},{"propositions":[],"lastnames":["Velingker"],"firstnames":["Ameya"],"suffixes":[]}],"title":"Restricted Isometry of Fourier Matrices and List Decodability of Random Linear Codes","year":"2013","booktitle":"Proceedings of the 24th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)","pages":"432–442","numpages":"11","keywords":"combinatorial list decoding, compressed sensing, Gaussian processes","url_link":"https://dl.acm.org/doi/10.5555/2627817.2627848","abstract":"We prove that a random linear code over $\\mathbb{F}_q$, with probability arbitrarily close to $1$, is list decodable at radius $1-1/q-ε$ with list size $L=O(1/ε^2)$ and rate $R=Ω_q(ε^2/(łog^3(1/ε)))$. Up to the polylogarithmic factor in $1/ε$ and constant factors depending on $q$, this matches the lower bound $L=Ω_q(1/ε^2)$ for the list size and upper bound $R=O_q(ε^2)$ for the rate. Previously only existence (and not abundance) of such codes was known for the special case $q=2$ (Guruswami, Håstad, Sudan and Zuckerman, 2002). In order to obtain our result, we employ a relaxed version of the well known Johnson bound on list decoding that translates the <i>average</i> Hamming distance between codewords to list decoding guarantees. We furthermore prove that the desired average-distance guarantees hold for a code provided that a natural complex matrix encoding the codewords satisfies the Restricted Isometry Property with respect to the Euclidean norm (RIP-2). 
For the case of random binary linear codes, this matrix coincides with a random submatrix of the Hadamard-Walsh transform matrix that is well studied in the compressed sensing literature. Finally, we improve the analysis of Rudelson and Vershynin (2008) on the number of random frequency samples required for exact reconstruction of $k$-sparse signals of length $N$. Specifically, we improve the number of samples from $O(k łog(N) łog^2(k) (łog k + łogłog N))$ to $O(k łog(N) · łog^3(k))$. The proof involves bounding the expected supremum of a related Gaussian process by using an improved analysis of the metric defined by the process. This improvement is crucial for our application in list decoding.","note":"Extended version in SIAM Journal on Computing (SICOMP).","url_paper":"https://eccc.weizmann.ac.il//report/2012/082","bibtex":"@INPROCEEDINGS{ref:conf:CGV13,\n author =\t {Cheraghchi, Mahdi and Guruswami, Venkatesan and\n Velingker, Ameya},\n title =\t {Restricted Isometry of {Fourier} Matrices and List\n Decodability of Random Linear Codes},\n year =\t 2013,\n booktitle =\t {Proceedings of the 24th {Annual ACM-SIAM Symposium\n on Discrete Algorithms (SODA)}},\n pages =\t {432--442},\n numpages =\t 11,\n keywords =\t {combinatorial list decoding, compressed sensing,\n {Gaussian} processes},\n url_Link =\t {https://dl.acm.org/doi/10.5555/2627817.2627848},\n abstract =\t {We prove that a random linear code over\n $\\mathbb{F}_q$, with probability arbitrarily close\n to $1$, is list decodable at radius $1-1/q-\\epsilon$\n with list size $L=O(1/\\epsilon^2)$ and rate\n $R=\\Omega_q(\\epsilon^2/(\\log^3(1/\\epsilon)))$. Up to\n the polylogarithmic factor in $1/\\epsilon$ and\n constant factors depending on $q$, this matches the\n lower bound $L=\\Omega_q(1/\\epsilon^2)$ for the list\n size and upper bound $R=O_q(\\epsilon^2)$ for the\n rate. Previously only existence (and not abundance)\n of such codes was known for the special case $q=2$\n (Guruswami, H{\\aa}stad, Sudan and Zuckerman, 2002).\n In order to obtain our result, we employ a relaxed\n version of the well known Johnson bound on list\n decoding that translates the \\textit{average}\n Hamming distance between codewords to list decoding\n guarantees. We furthermore prove that the desired\n average-distance guarantees hold for a code provided\n that a natural complex matrix encoding the codewords\n satisfies the Restricted Isometry Property with\n respect to the Euclidean norm (RIP-2). For the case\n of random binary linear codes, this matrix coincides\n with a random submatrix of the Hadamard-Walsh\n transform matrix that is well studied in the\n compressed sensing literature. Finally, we improve\n the analysis of Rudelson and Vershynin (2008) on the\n number of random frequency samples required for\n exact reconstruction of $k$-sparse signals of length\n $N$. Specifically, we improve the number of samples\n from $O(k \\log(N) \\log^2(k) (\\log k + \\log\\log N))$\n to $O(k \\log(N) \\cdot \\log^3(k))$. The proof\n involves bounding the expected supremum of a related\n Gaussian process by using an improved analysis of\n the metric defined by the process. 
This improvement\n is crucial for our application in list decoding.},\n note =\t {Extended version in {SIAM Journal on Computing\n (SICOMP).}},\n url_Paper =\t {https://eccc.weizmann.ac.il//report/2012/082}\n}\n\n","author_short":["Cheraghchi, M.","Guruswami, V.","Velingker, A."],"key":"ref:conf:CGV13","id":"ref:conf:CGV13","bibbaseid":"cheraghchi-guruswami-velingker-restrictedisometryoffouriermatricesandlistdecodabilityofrandomlinearcodes-2013","role":"author","urls":{" link":"https://dl.acm.org/doi/10.5555/2627817.2627848"," paper":"https://eccc.weizmann.ac.il//report/2012/082"},"keyword":["combinatorial list decoding","compressed sensing","Gaussian processes"],"metadata":{"authorlinks":{"cheraghchi, m":"https://mahdi.ch/writings/"}},"downloads":14},"bibtype":"inproceedings","biburl":"http://mahdi.ch/writings/cheraghchi.bib","creationDate":"2020-05-29T00:07:23.303Z","downloads":15,"keywords":["combinatorial list decoding","compressed sensing","gaussian processes"],"search_terms":["restricted","isometry","fourier","matrices","list","decodability","random","linear","codes","cheraghchi","guruswami","velingker"],"title":"Restricted Isometry of Fourier Matrices and List Decodability of Random Linear Codes","year":2013,"dataSources":["YZqdBBx6FeYmvQE6D","A3rW4SyEF9gWcyMcH"]}