Data Encoding for Byzantine-Resilient Distributed Optimization. Data, D., Song, L., & Diggavi, S. N. IEEE Transactions on Information Theory, 67(2):1117-1140, February 2021.
We study distributed optimization in the presence of Byzantine adversaries, where both data and computation are distributed among m worker machines, t of which may be corrupt. The compromised nodes may collaboratively and arbitrarily deviate from their pre-specified programs, and a designated (master) node iteratively computes the model/parameter vector for generalized linear models. In this work, we primarily focus on two iterative algorithms: Proximal Gradient Descent (PGD) and Coordinate Descent (CD); gradient descent (GD) is a special case of both. PGD is typically used in the data-parallel setting, where data is partitioned across samples, whereas CD is used in the model-parallel setting, where data is partitioned across the parameter space. In this paper, we propose a method based on data encoding and error correction over real numbers to combat adversarial attacks. We can tolerate up to t ≤ ⌊(m−1)/2⌋ corrupt worker nodes, which is information-theoretically optimal. We give deterministic guarantees, and our method does not assume any probability distribution on the data. We develop a sparse encoding scheme which enables computationally efficient data encoding and decoding. We demonstrate a trade-off between the corruption threshold and the resource requirements (storage, computational, and communication complexity). As an example, for t ≤ m/3, our scheme incurs only a constant overhead on these resources over that required by the plain distributed PGD/CD algorithms, which provide no adversarial protection. To the best of our knowledge, ours is the first paper that makes CD secure against adversarial attacks. Our encoding scheme extends efficiently to the data streaming model and to stochastic gradient descent (SGD). We also give experimental results to show the efficacy of our proposed schemes.
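The optimality of the t ≤ ⌊(m−1)/2⌋ threshold is easiest to see in the degenerate full-replication baseline: if every worker stores all the data and recomputes the same gradient, the honest replies agree exactly and outnumber the at most t corrupt ones, so a majority vote at the master recovers the true gradient deterministically. The Python sketch below illustrates only this baseline, not the paper's sparse encoding scheme, which attains the same threshold with far lower (for t ≤ m/3, constant-factor) overhead; the function names, the least-squares objective, and all parameters are illustrative assumptions.

import numpy as np
from collections import Counter

def full_gradient(A, b, x):
    # Least-squares gradient at x: grad = A^T (A x - b)
    return A.T @ (A @ x - b)

def byzantine_resilient_step(A, b, x, m=7, t=3, lr=1e-3, seed=0):
    # Toy baseline (full replication + exact majority vote), NOT the
    # paper's sparse encoding scheme. Every worker stores all the data
    # and recomputes the same gradient; honest workers agree exactly,
    # so the true gradient appears m - t > t times and a majority vote
    # recovers it, for any t <= floor((m-1)/2). The paper achieves the
    # same threshold with only constant-factor overhead (for t <= m/3)
    # via sparse encoding and error correction over the reals.
    rng = np.random.default_rng(seed)
    corrupt = set(rng.choice(m, size=t, replace=False).tolist())
    replies = []
    for w in range(m):
        if w in corrupt:
            # Byzantine worker: returns an arbitrary vector.
            replies.append(rng.normal(size=x.shape))
        else:
            replies.append(full_gradient(A, b, x))
    # Majority vote over the exact byte representations of the replies.
    votes = Counter(r.tobytes() for r in replies)
    g = np.frombuffer(votes.most_common(1)[0][0], dtype=x.dtype)
    return x - lr * g

# Example: m = 7 workers, t = 3 = floor((7-1)/2) of them corrupt.
rng = np.random.default_rng(42)
A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
x = np.zeros(5)
for _ in range(100):
    x = byzantine_resilient_step(A, b, x, m=7, t=3)

The m-fold storage and computation blow-up of this replication baseline is exactly the overhead the paper's sparse encoding avoids.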
@article{DBLP:journals/corr/abs-1907-02664,
abstract = {We study distributed optimization in the presence of Byzantine adversaries, where both data and computation are distributed among m worker machines, t of which may be corrupt. The compromised nodes may collaboratively and arbitrarily deviate from their pre-specified programs, and a designated (master) node iteratively computes the model/parameter vector for generalized linear models. In this work, we primarily focus on two iterative algorithms: Proximal Gradient Descent (PGD) and Coordinate Descent (CD); gradient descent (GD) is a special case of both. PGD is typically used in the data-parallel setting, where data is partitioned across samples, whereas CD is used in the model-parallel setting, where data is partitioned across the parameter space. In this paper, we propose a method based on data encoding and error correction over real numbers to combat adversarial attacks. We can tolerate up to $t \leq \lfloor \frac{m-1}{2} \rfloor$ corrupt worker nodes, which is information-theoretically optimal. We give deterministic guarantees, and our method does not assume any probability distribution on the data. We develop a {\em sparse} encoding scheme which enables computationally efficient data encoding and decoding. We demonstrate a trade-off between the corruption threshold and the resource requirements (storage, computational, and communication complexity). As an example, for $t \leq \frac{m}{3}$, our scheme incurs only a {\em constant} overhead on these resources over that required by the plain distributed PGD/CD algorithms, which provide no adversarial protection. To the best of our knowledge, ours is the first paper that makes CD secure against adversarial attacks. Our encoding scheme extends efficiently to the data streaming model and to stochastic gradient descent (SGD). We also give experimental results to show the efficacy of our proposed schemes.},
author = {Deepesh Data and
Linqi Song and
Suhas N. Diggavi},
eprint = {1907.02664},
journal = {IEEE Transactions on Information Theory},
tags = {journal,SDL,DML},
title = {Data Encoding for Byzantine-Resilient Distributed Optimization},
type = {2},
url_arxiv = {http://arxiv.org/abs/1907.02664},
year = {2021},
volume = {67},
number = {2},
pages = {1117--1140},
doi = {10.1109/TIT.2020.3035868},
issn = {1557-9654},
month = feb,
}
{"_id":"Y3XxqDAhAN2tzSAng","bibbaseid":"data-song-diggavi-dataencodingforbyzantineresilientdistributedoptimization-2021","author_short":["Data, D.","Song, L.","Diggavi, S. N."],"bibdata":{"bibtype":"article","type":"2","abstract":"We study distributed optimization in the presence of Byzantine adversaries, where both data and computation are distributed among m worker machines, t of which may be corrupt. The compromised nodes may collaboratively and arbitrarily deviate from their pre-specified programs, and a designated (master) node iteratively computes the model/parameter vector for generalized linear models. In this work, we primarily focus on two iterative algorithms: Proximal Gradient Descent (PGD) and Coordinate Descent (CD). Gradient descent (GD) is a special case of these algorithms. PGD is typically used in the data-parallel setting, where data is partitioned across different samples, whereas, CD is used in the model-parallelism setting, where data is partitioned across the parameter space. In this paper, we propose a method based on data encoding and error correction over real numbers to combat adversarial attacks. We can tolerate up to t≤⌊m−12⌋ corrupt worker nodes, which is information-theoretically optimal. We give deterministic guarantees, and our method does not assume any probability distribution on the data. We develop a \\em sparse encoding scheme which enables computationally efficient data encoding and decoding. We demonstrate a trade-off between the corruption threshold and the resource requirements (storage, computational, and communication complexity). As an example, for t≤m3, our scheme incurs only a \\em constant overhead on these resources, over that required by the plain distributed PGD/CD algorithms which provide no adversarial protection. To the best of our knowledge, ours is the first paper that makes CD secure against adversarial attacks. Our encoding scheme extends efficiently to the data streaming model and for stochastic gradient descent (SGD). We also give experimental results to show the efficacy of our proposed schemes.","author":[{"firstnames":["Deepesh"],"propositions":[],"lastnames":["Data"],"suffixes":[]},{"firstnames":["Linqi"],"propositions":[],"lastnames":["Song"],"suffixes":[]},{"firstnames":["Suhas","N."],"propositions":[],"lastnames":["Diggavi"],"suffixes":[]}],"eprint":"1907.02664","journal":"IEEE Transactions on Information Theory","tags":"journal,SDL,DML","title":"Data Encoding for Byzantine-Resilient Distributed Optimization","url_arxiv":"http://arxiv.org/abs/1907.02664","volume":"67","year":"2021","number":"2","pages":"1117-1140","doi":"10.1109/TIT.2020.3035868","issn":"1557-9654","month":"Feb","bibtex":"@article{DBLP:journals/corr/abs-1907-02664,\n abstract = {We study distributed optimization in the presence of Byzantine adversaries, where both data and computation are distributed among m worker machines, t of which may be corrupt. The compromised nodes may collaboratively and arbitrarily deviate from their pre-specified programs, and a designated (master) node iteratively computes the model/parameter vector for generalized linear models. In this work, we primarily focus on two iterative algorithms: Proximal Gradient Descent (PGD) and Coordinate Descent (CD). Gradient descent (GD) is a special case of these algorithms. PGD is typically used in the data-parallel setting, where data is partitioned across different samples, whereas, CD is used in the model-parallelism setting, where data is partitioned across the parameter space. 
In this paper, we propose a method based on data encoding and error correction over real numbers to combat adversarial attacks. We can tolerate up to t≤⌊m−12⌋ corrupt worker nodes, which is information-theoretically optimal. We give deterministic guarantees, and our method does not assume any probability distribution on the data. We develop a {\\em sparse} encoding scheme which enables computationally efficient data encoding and decoding. We demonstrate a trade-off between the corruption threshold and the resource requirements (storage, computational, and communication complexity). As an example, for t≤m3, our scheme incurs only a {\\em constant} overhead on these resources, over that required by the plain distributed PGD/CD algorithms which provide no adversarial protection. To the best of our knowledge, ours is the first paper that makes CD secure against adversarial attacks. Our encoding scheme extends efficiently to the data streaming model and for stochastic gradient descent (SGD). We also give experimental results to show the efficacy of our proposed schemes.},\n author = {Deepesh Data and\nLinqi Song and\nSuhas N. Diggavi},\n eprint = {1907.02664},\n journal = {IEEE Transactions on Information Theory},\n tags = {journal,SDL,DML},\n title = {Data Encoding for Byzantine-Resilient Distributed Optimization},\n type = {2},\n url_arxiv = {http://arxiv.org/abs/1907.02664},\n volume = {},\n year = {2021},\n volume={67},\n number={2},\n pages={1117-1140},\n doi = {10.1109/TIT.2020.3035868},\n ISSN={1557-9654},\n month={Feb},\n}\n\n","author_short":["Data, D.","Song, L.","Diggavi, S. N."],"key":"DBLP:journals/corr/abs-1907-02664","id":"DBLP:journals/corr/abs-1907-02664","bibbaseid":"data-song-diggavi-dataencodingforbyzantineresilientdistributedoptimization-2021","role":"author","urls":{" arxiv":"http://arxiv.org/abs/1907.02664"},"metadata":{"authorlinks":{}},"downloads":3,"html":""},"bibtype":"article","biburl":"https://bibbase.org/network/files/e2kjGxYgtBo8SWSbC","dataSources":["hicKnsKYNEFXC4CgH","jxCYzXXYRqw2fiEXQ","wCByFFrQMyRwfzrJ6","yuqM5ah4HMsTyDrMa","YaM87hGQiepg5qijZ","n9wmfkt5w8CPqCepg","soj2cS6PgG8NPmWGr","FaDBDiyFAJY5pL28h","ycfdiwWPzC2rE6H77"],"keywords":[],"search_terms":["data","encoding","byzantine","resilient","distributed","optimization","data","song","diggavi"],"title":"Data Encoding for Byzantine-Resilient Distributed Optimization","year":2021,"downloads":3}