The Differential Entropy of Mixtures: New Bounds and Applications. Melbourne, J., Talukdar, S., Bhaban, S., Madiman, M., & Salapaka, M. V. 2018.
Mixture distributions are extensively used as a modeling tool in diverse areas from machine learning to communications engineering to physics, and obtaining bounds on the entropy of probability distributions is of fundamental importance in many of these applications. This article provides sharp bounds on the entropy concavity deficit, which is the difference between the entropy of the mixture and the weighted sum of entropies of constituent components. Toward establishing lower and upper bounds on the concavity deficit, results that are of importance in their own right are obtained. In order to obtain nontrivial upper bounds, properties of the skew-divergence are developed and notions of "skew" $f$-divergences are introduced; a reverse Pinsker inequality and a bound on Jensen-Shannon divergence are obtained along the way. Complementary lower bounds are derived with special attention paid to the case that corresponds to independent summation of a continuous and a discrete random variable. Several applications of the bounds are delineated, including to mutual information of additive noise channels, thermodynamics of computation, and functional inequalities.
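The central quantity in the abstract, the entropy concavity deficit h(Σᵢ wᵢ fᵢ) − Σᵢ wᵢ h(fᵢ), can be illustrated numerically. The sketch below is not from the paper; it assumes a two-component Gaussian mixture, computes the component entropies in closed form, estimates the mixture entropy by Monte Carlo, and checks the classical bounds 0 ≤ deficit ≤ H(w), where H(w) is the discrete entropy of the mixing weights.

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    # density of N(mu, sigma^2) at x
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def gauss_entropy(sigma):
    # differential entropy of N(mu, sigma^2) in nats (independent of mu)
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def concavity_deficit(weights, mus, sigmas, n=200_000, seed=0):
    """Estimate h(mixture) - sum_i w_i h(f_i) for a Gaussian mixture."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # draw from the mixture: pick a component, then sample from it
        i = rng.choices(range(len(weights)), weights=weights)[0]
        x = rng.gauss(mus[i], sigmas[i])
        # mixture density f(x) = sum_i w_i f_i(x)
        fx = sum(w * gauss_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas))
        total += -math.log(fx)
    h_mix = total / n                      # Monte Carlo estimate of h(f)
    h_avg = sum(w * gauss_entropy(s) for w, s in zip(weights, sigmas))
    return h_mix - h_avg

# Two well-separated unit-variance components with equal weights:
# the deficit approaches its upper bound H(w) = ln 2 as overlap vanishes.
deficit = concavity_deficit([0.5, 0.5], [-3.0, 3.0], [1.0, 1.0])
```

With the components 6 standard deviations apart, the deficit is close to ln 2 ≈ 0.693; moving the means together shrinks it toward 0, which is the trivial lower bound from concavity of differential entropy.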
