Bayesian Entropy Estimation for Countable Discrete Distributions. Archer, E., Park, I. M., & Pillow, J. ArXiv e-prints (arXiv:1302.0328), February 2013.
We consider the problem of estimating Shannon's entropy $H$ from discrete data, in cases where the number of possible symbols is unknown or even countably infinite. The Pitman-Yor process, a generalization of the Dirichlet process, provides a tractable prior distribution over the space of countably infinite discrete distributions, and has found major applications in Bayesian non-parametric statistics and machine learning. Here we show that it also provides a natural family of priors for Bayesian entropy estimation, due to the fact that moments of the induced posterior distribution over $H$ can be computed analytically. We derive formulas for the posterior mean (Bayes' least squares estimate) and variance under Dirichlet and Pitman-Yor process priors. Moreover, we show that a fixed Dirichlet or Pitman-Yor process prior implies a narrow prior distribution over $H$, meaning the prior strongly determines the entropy estimate in the under-sampled regime. We derive a family of continuous mixing measures such that the resulting mixture of Pitman-Yor processes produces an approximately flat prior over $H$. We show that the resulting Pitman-Yor Mixture (PYM) entropy estimator is consistent for a large class of distributions. We explore the theoretical properties of the resulting estimator, and show that it performs well both in simulation and in application to real data.
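
As a rough illustration of the Bayesian approach the abstract describes (a sketch only, not the paper's PYM estimator, which additionally mixes over Pitman-Yor priors): for a fixed alphabet of $K$ symbols under a symmetric Dirichlet($\alpha$) prior, the posterior mean of $H$ has the standard closed form $E[H \mid n] = \psi(N + K\alpha + 1) - \sum_k \frac{n_k + \alpha}{N + K\alpha}\, \psi(n_k + \alpha + 1)$, where $\psi$ is the digamma function and $N = \sum_k n_k$. The Python sketch below computes this alongside the naive plug-in estimate; the function names and the default $\alpha = 1$ are illustrative assumptions.

import numpy as np
from scipy.special import digamma

def plugin_entropy(counts):
    # Naive plug-in estimate: H = -sum_k p_k log p_k with p_k = n_k / N (in nats).
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def dirichlet_posterior_mean_entropy(counts, alpha=1.0):
    # Posterior mean of H (in nats) under a symmetric Dirichlet(alpha) prior on a
    # fixed K-symbol alphabet: the fixed-K analogue of the analytic posterior
    # moments the paper derives for Dirichlet and Pitman-Yor process priors.
    counts = np.asarray(counts, dtype=float)
    N, K = counts.sum(), counts.size
    A = N + K * alpha                          # total posterior concentration
    w = (counts + alpha) / A                   # posterior mean probabilities
    return digamma(A + 1.0) - np.sum(w * digamma(counts + alpha + 1.0))

# Under-sampled example: 5 samples from an 8-symbol alphabet. The plug-in
# estimate is biased low; the Bayesian estimate is pulled up toward the prior.
counts = np.array([3, 1, 1, 0, 0, 0, 0, 0])
print(plugin_entropy(counts))                      # ~0.95 nats
print(dirichlet_posterior_mean_entropy(counts))    # ~1.69 nats (log 8 ~ 2.08)
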
@ARTICLE{Archer2013a,
  author = {Archer, Evan and Park, Il Memming and Pillow, Jonathan},
  title = {{Bayes}ian Entropy Estimation for Countable Discrete Distributions},
  journal = {ArXiv e-prints},
  year = {2013},
  month = feb,
  abstract = {We consider the problem of estimating Shannon's entropy $H$ from discrete
	data, in cases where the number of possible symbols is unknown or
	even countably infinite. The {Pitman-Yor} process, a generalization
	of the Dirichlet process, provides a tractable prior distribution over
	the space of countably infinite discrete distributions, and has found
	major applications in Bayesian non-parametric statistics and machine
	learning. Here we show that it also provides a natural family of
	priors for Bayesian entropy estimation, due to the fact that moments
	of the induced posterior distribution over $H$ can be computed analytically.
	We derive formulas for the posterior mean (Bayes' least squares estimate)
	and variance under Dirichlet and {Pitman-Yor} process priors. Moreover,
	we show that a fixed Dirichlet or {Pitman-Yor} process prior implies
	a narrow prior distribution over $H$, meaning the prior strongly
	determines the entropy estimate in the under-sampled regime. We derive
	a family of continuous mixing measures such that the resulting mixture
	of {Pitman-Yor} processes produces an approximately flat prior over
	$H$. We show that the resulting {Pitman-Yor} Mixture ({PYM}) entropy
	estimator is consistent for a large class of distributions. We explore
	the theoretical properties of the resulting estimator, and show that
	it performs well both in simulation and in application to real data.},
  archiveprefix = {arXiv},
  day = {2},
  eprint = {1302.0328},
  keywords = {bayesian, entropy-estimation, nonparametric-bayes, pitman-yor-process},
  primaryclass = {cs.IT},
  url = {http://arxiv.org/abs/1302.0328}
}
