Lower Bounds for Locally Private Estimation via Communication Complexity. Duchi, J. & Rogers, R. In 32nd Conference on Learning Theory, COLT 2019, volume 99 of Proceedings of Machine Learning Research, pages 1161–1191, 2019. PMLR.
[DR19] Develops a communication-complexity-based methodology for proving lower bounds under local differential privacy (LDP) constraints in the blackboard communication model. Obtains lower bounds for mean estimation of Bernoulli product distributions under general $\ell_p$ losses, as well as tight bounds for mean estimation of Gaussian and sparse Gaussian distributions under the $\ell_2$ loss.
bibtex   
@inproceedings{DR19,
  title     = {Lower Bounds for Locally Private Estimation via Communication Complexity},
  author    = {Duchi, John and Rogers, Ryan},
  booktitle = {32nd Conference on Learning Theory, {COLT} 2019},
  pages     = {1161--1191},
  year      = {2019},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  publisher = {PMLR}
 }
