q-Paths: Generalizing the geometric annealing path using power means. Masrani, V., Brekelmans, R., Bui, T., Nielsen, F., Galstyan, A., Ver Steeg, G., & Wood, F. In de Campos, C. & Maathuis, M. H., editors, Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, volume 161 of Proceedings of Machine Learning Research, pages 1938–1947, 27–30 Jul 2021. PMLR.
Many common machine learning methods involve the geometric annealing path, a sequence of intermediate densities between two distributions of interest constructed using the geometric average. While alternatives such as the moment-averaging path have demonstrated performance gains in some settings, their practical applicability remains limited by exponential family endpoint assumptions and the lack of a closed-form energy function. In this work, we introduce $q$-paths, a family of paths derived from a generalized notion of the mean, which includes the geometric and arithmetic mixtures as special cases and admits a simple closed form involving the deformed logarithm function from nonextensive thermodynamics. Following previous analysis of the geometric path, we interpret our $q$-paths as corresponding to a $q$-exponential family of distributions, and provide a variational representation of intermediate densities as minimizing a mixture of $\alpha$-divergences to the endpoints. We show that small deviations away from the geometric path yield empirical gains for Bayesian inference using Sequential Monte Carlo and generative model evaluation using Annealed Importance Sampling.
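The construction sketched in the abstract can be illustrated with a few lines of NumPy: the $q$-path intermediate density is the power mean of order $1-q$ of the endpoint densities, recovering the geometric path as $q \to 1$ and the arithmetic mixture at $q = 0$. The function name and signature below are illustrative, not taken from the paper's released code.

```python
import numpy as np

def q_path_density(pi0, pi1, beta, q=1.0):
    """Unnormalized intermediate density on a q-path (illustrative sketch).

    pi0, pi1 : positive (unnormalized) endpoint density values
    beta     : mixing weight in [0, 1]
    q        : path parameter; q -> 1 recovers the geometric path,
               q = 0 the arithmetic mixture.
    """
    pi0 = np.asarray(pi0, dtype=float)
    pi1 = np.asarray(pi1, dtype=float)
    if np.isclose(q, 1.0):
        # Geometric path: pi0^(1-beta) * pi1^beta
        return pi0 ** (1 - beta) * pi1 ** beta
    # Power mean of order p = 1 - q:
    # [(1-beta) * pi0^p + beta * pi1^p]^(1/p)
    p = 1.0 - q
    return ((1 - beta) * pi0 ** p + beta * pi1 ** p) ** (1.0 / p)
```

For example, with endpoint values 4 and 9 and `beta = 0.5`, `q = 1` gives the geometric mean 6, while `q = 0` gives the arithmetic mean 6.5; intermediate `q` interpolates between the two.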
