Efficient Inference Amortization in Graphical Models using Structured Continuous Conditional Normalizing Flows. Weilbach, C., Beronov, B., Harvey, W., & Wood, F. In 2nd Symposium on Advances in Approximate Bayesian Inference (AABI), 2019.
Link: https://openreview.net/forum?id=BJlhYknNFS
Paper: https://openreview.net/pdf?id=BJlhYknNFS

Abstract: We introduce a more efficient neural architecture for amortized inference, which combines continuous and conditional normalizing flows using a principled choice of structure. Our gradient flow derives its sparsity pattern from the minimally faithful inverse of its underlying graphical model. We find that this factorization reduces both the number of parameters in the neural network and the number of adaptive integration steps in the ODE solver. Consequently, throughput at training and inference time is increased without decreasing performance compared to unconstrained flows. By expressing the structural inversion and the flow construction as compilation passes of a probabilistic programming language, we demonstrate their applicability to the stochastic inversion of realistic models such as convolutional neural networks (CNNs).
@inproceedings{WEI-19,
title={Efficient Inference Amortization in Graphical Models using Structured Continuous Conditional Normalizing Flows},
author={Weilbach, Christian and Beronov, Boyan and Harvey, William and Wood, Frank},
booktitle={2nd Symposium on Advances in Approximate Bayesian Inference (AABI)},
support={D3M},
url_Link={https://openreview.net/forum?id=BJlhYknNFS},
url_Paper={https://openreview.net/pdf?id=BJlhYknNFS},
abstract={We introduce a more efficient neural architecture for amortized inference, which combines continuous and conditional normalizing flows using a principled choice of structure. Our gradient flow derives its sparsity pattern from the minimally faithful inverse of its underlying graphical model. We find that this factorization reduces both the number of parameters in the neural network and the number of adaptive integration steps in the ODE solver. Consequently, throughput at training and inference time is increased without decreasing performance compared to unconstrained flows. By expressing the structural inversion and the flow construction as compilation passes of a probabilistic programming language, we demonstrate their applicability to the stochastic inversion of realistic models such as convolutional neural networks (CNNs).},
year={2019}
}
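
To make the structural idea in the abstract concrete, the following is a minimal sketch, not the authors' implementation: a continuous normalizing flow whose dynamics network is constrained by a binary sparsity mask. The 3-variable chain mask, the single masked layer, and the torchdiffeq reference are illustrative assumptions; in the paper, the sparsity pattern would be derived from the minimally faithful inverse of the underlying graphical model.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedLinear(nn.Linear):
    """Linear layer with a fixed binary mask on its weights, so each output
    dimension only reads the input dimensions allowed by the graph."""
    def __init__(self, in_features, out_features, mask):
        super().__init__(in_features, out_features)
        self.register_buffer("mask", mask)  # shape (out_features, in_features)

    def forward(self, x):
        return F.linear(x, self.weight * self.mask, self.bias)

class StructuredODEFunc(nn.Module):
    """Dynamics f(t, z) of a continuous normalizing flow; its Jacobian
    sparsity follows the mask. A fuller implementation would stack several
    masked layers with compatible masks; one layer keeps the sketch short."""
    def __init__(self, mask):
        super().__init__()
        dim = mask.shape[0]
        self.net = MaskedLinear(dim, dim, mask)

    def forward(self, t, z):
        return torch.tanh(self.net(z))

# Hypothetical mask for a 3-variable chain z0 -> z1 -> z2 (plus self-edges);
# the paper instead derives this pattern from the inverted graphical model.
mask = torch.tensor([[1., 0., 0.],
                     [1., 1., 0.],
                     [0., 1., 1.]])
func = StructuredODEFunc(mask)

# The flow is obtained by integrating dz/dt = f(t, z) with an adaptive ODE
# solver, e.g. torchdiffeq's odeint:
#   from torchdiffeq import odeint
#   z_T = odeint(func, z_0, torch.tensor([0.0, 1.0]))[-1]
z_0 = torch.randn(4, 3)
print(func(0.0, z_0).shape)  # torch.Size([4, 3])

Under these assumptions, masking zeroes out most weights, which is one way to picture the abstract's claim: fewer effective parameters, and a sparser Jacobian for the adaptive ODE solver to handle.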
{"_id":"JZstf4LA4KvJHRA4N","bibbaseid":"weilbach-beronov-harvey-wood-efficientinferenceamortizationingraphicalmodelsusingstructuredcontinuousconditionalnormalizingflows-2019","authorIDs":[],"author_short":["Weilbach, C.","Beronov, B.","Harvey, W.","Wood, F."],"bibdata":{"bibtype":"inproceedings","type":"inproceedings","title":"Efficient Inference Amortization in Graphical Models using Structured Continuous Conditional Normalizing Flows","author":[{"propositions":[],"lastnames":["Weilbach"],"firstnames":["Christian"],"suffixes":[]},{"propositions":[],"lastnames":["Beronov"],"firstnames":["Boyan"],"suffixes":[]},{"propositions":[],"lastnames":["Harvey"],"firstnames":["William"],"suffixes":[]},{"propositions":[],"lastnames":["Wood"],"firstnames":["Frank"],"suffixes":[]}],"booktitle":"2nd Symposium on Advances in Approximate Bayesian Inference (AABI)","support":"D3M","url_link":"https://openreview.net/forum?id=BJlhYknNFS","url_paper":"https://openreview.net/pdf?id=BJlhYknNFS","abstract":"We introduce a more efficient neural architecture for amortized inference, which combines continuous and conditional normalizing flows using a principled choice of structure. Our gradient flow derives its sparsity pattern from the minimally faithful inverse of its underlying graphical model. We find that this factorization reduces the necessary numbers both of parameters in the neural network and of adaptive integration steps in the ODE solver. Consequently, the throughput at training time and inference time is increased, without decreasing performance in comparison to unconstrained flows. By expressing the structural inversion and the flow construction as compilation passes of a probabilistic programming language, we demonstrate their applicability to the stochastic inversion of realistic models such as convolutional neural networks (CNN).","year":"2019","bibtex":"@inproceedings{WEI-19,\n title={Efficient Inference Amortization in Graphical Models using Structured Continuous Conditional Normalizing Flows},\n author={Weilbach, Christian and Beronov, Boyan and Harvey, William and Wood, Frank},\n booktitle={2nd Symposium on Advances in Approximate Bayesian Inference (AABI)},\n support = {D3M},\n url_Link={https://openreview.net/forum?id=BJlhYknNFS},\n url_Paper={https://openreview.net/pdf?id=BJlhYknNFS},\n abstract = {We introduce a more efficient neural architecture for amortized inference, which combines continuous and conditional normalizing flows using a principled choice of structure. Our gradient flow derives its sparsity pattern from the minimally faithful inverse of its underlying graphical model. We find that this factorization reduces the necessary numbers both of parameters in the neural network and of adaptive integration steps in the ODE solver. Consequently, the throughput at training time and inference time is increased, without decreasing performance in comparison to unconstrained flows. 
By expressing the structural inversion and the flow construction as compilation passes of a probabilistic programming language, we demonstrate their applicability to the stochastic inversion of realistic models such as convolutional neural networks (CNN).},\n year={2019}\n}\n\n","author_short":["Weilbach, C.","Beronov, B.","Harvey, W.","Wood, F."],"key":"WEI-19","id":"WEI-19","bibbaseid":"weilbach-beronov-harvey-wood-efficientinferenceamortizationingraphicalmodelsusingstructuredcontinuousconditionalnormalizingflows-2019","role":"author","urls":{" link":"https://openreview.net/forum?id=BJlhYknNFS"," paper":"https://openreview.net/pdf?id=BJlhYknNFS"},"metadata":{"authorlinks":{}},"downloads":0},"bibtype":"inproceedings","biburl":"https://raw.githubusercontent.com/plai-group/bibliography/master/group_publications.bib","creationDate":"2020-01-27T02:13:33.776Z","downloads":0,"keywords":[],"search_terms":["efficient","inference","amortization","graphical","models","using","structured","continuous","conditional","normalizing","flows","weilbach","beronov","harvey","wood"],"title":"Efficient Inference Amortization in Graphical Models using Structured Continuous Conditional Normalizing Flows","year":2019,"dataSources":["7avRLRrz2ifJGMKcD","BKH7YtW7K7WNMA3cj","wyN5DxtoT6AQuiXnm"]}