Uncertainty in Neural Processes. Naderiparizi, S., Chiu, K., Bloem-Reddy, B., & Wood, F. (2020). arXiv preprint.
Abstract: We explore the effects of architecture and training-objective choices on amortized posterior predictive inference in probabilistic conditional generative models. This work is intended as a counterpoint to a recent trend in the literature that stresses achieving good samples when the amount of conditioning data is large; we instead focus on the case where the conditioning data are scarce. We highlight specific architecture and objective choices that lead to qualitative and quantitative improvements in posterior inference in this low-data regime. In particular, we examine how the choice of pooling operator and variational family affects posterior quality in neural processes. Superior posterior predictive samples drawn from our novel neural process architectures are demonstrated via image completion/in-painting experiments.
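
To make the "pooling operator" choice mentioned in the abstract concrete, below is a minimal sketch of a neural-process-style context encoder. It is not the authors' code: the two-layer embedding network, the dimensions, and names such as encode_context are illustrative assumptions; only the general idea (embed each context pair, then aggregate with a permutation-invariant pooling operator) reflects the standard neural process setup the paper builds on.

# Illustrative sketch (assumed architecture, not from the paper): a neural-process
# encoder in which the permutation-invariant pooling operator is a swappable choice.
import numpy as np

def mlp_embed(xy, W1, b1, W2, b2):
    # Embed each (x, y) context pair into a representation r_i.
    h = np.tanh(xy @ W1 + b1)
    return h @ W2 + b2

def encode_context(xs, ys, params, pooling="mean"):
    # Encode a context set into a single representation r via pooling over r_i.
    pairs = np.concatenate([xs, ys], axis=-1)   # (n_context, d_x + d_y)
    r_i = mlp_embed(pairs, *params)             # (n_context, d_r)
    if pooling == "mean":
        return r_i.mean(axis=0)
    if pooling == "max":                        # an alternative pooling operator
        return r_i.max(axis=0)
    if pooling == "sum":
        return r_i.sum(axis=0)
    raise ValueError(f"unknown pooling operator: {pooling}")

# Toy usage: 3 context points, 1-dimensional x and y, representation size 4.
rng = np.random.default_rng(0)
params = (rng.normal(size=(2, 8)), np.zeros(8),
          rng.normal(size=(8, 4)), np.zeros(4))
xs, ys = rng.normal(size=(3, 1)), rng.normal(size=(3, 1))
r = encode_context(xs, ys, params, pooling="max")
print(r.shape)  # (4,)

Because each operator weights the per-point embeddings differently, its choice can matter most when only a handful of context points are available, which is the low-data regime the paper targets.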
