% Cheraghchi, M. and Indyk, P. Nearly Optimal Deterministic Algorithm for
% Sparse Walsh-Hadamard Transform. In Proceedings of the 27th Annual
% ACM-SIAM Symposium on Discrete Algorithms (SODA), pages 298--317, 2016.
% Preliminary version in Proceedings of SODA 2016.
@INPROCEEDINGS{ref:conf:CI16,
author = {Mahdi Cheraghchi and Piotr Indyk},
title = {Nearly Optimal Deterministic Algorithm for Sparse
Walsh-Hadamard Transform},
year = 2016,
booktitle = {Proceedings of the 27th Annual {ACM-SIAM} Symposium
on Discrete Algorithms ({SODA})},
pages = {298--317},
numpages = 20,
keywords = {Sparse recovery, sublinear time algorithms,
pseudorandomness, explicit constructions, sketching,
sparse Fourier transform},
url_Link = {https://dl.acm.org/doi/10.5555/2884435.2884458},
abstract = {For every fixed constant $\alpha > 0$, we design an
algorithm for computing the $k$-sparse
Walsh-Hadamard transform (i.e., Discrete Fourier
Transform over the Boolean cube) of an
$N$-dimensional vector $x \in \mathbb{R}^N$ in time
$k^{1+\alpha} (\log N)^{O(1)}$. Specifically, the
algorithm is given query access to $x$ and computes
a $k$-sparse $\tilde{x} \in \mathbb{R}^N$ satisfying
$\|\tilde{x} - \hat{x}\|_1 \leq c \|\hat{x} -
H_k(\hat{x})\|_1$, for an absolute constant $c > 0$,
where $\hat{x}$ is the transform of $x$ and
$H_k(\hat{x})$ is its best $k$-sparse
approximation. Our algorithm is fully deterministic
and only uses non-adaptive queries to $x$ (i.e., all
queries are determined and performed in parallel
when the algorithm starts). An important technical
tool that we use is a construction of nearly optimal
and linear lossless condensers which is a careful
instantiation of the GUV condenser (Guruswami,
Umans, Vadhan, JACM 2009). Moreover, we design a
deterministic and non-adaptive $\ell_1/\ell_1$
compressed sensing scheme based on general lossless
condensers that is equipped with a fast
reconstruction algorithm running in time
$k^{1+\alpha} (\log N)^{O(1)}$ (for the GUV-based
condenser) and is of independent interest. Our
scheme significantly simplifies and improves an
earlier expander-based construction due to Berinde,
Gilbert, Indyk, Karloff, Strauss (Allerton 2008).
Our methods use linear lossless condensers in a
black box fashion; therefore, any future improvement
on explicit constructions of such condensers would
immediately translate to improved parameters in our
framework (potentially leading to $k (\log
N)^{O(1)}$ reconstruction time with a reduced
exponent in the poly-logarithmic factor, and
eliminating the extra parameter $\alpha$). By
allowing the algorithm to use randomness, while
still using non-adaptive queries, the running time
of the algorithm can be improved to $\tilde{O}(k
\log^3 N)$.},
note = {Preliminary version in Proceedings of {SODA} 2016.},
url_Paper = {https://eccc.weizmann.ac.il//report/2015/076/}
}