Probabilistic Symmetry and Invariant Neural Networks. Bloem-Reddy, B. & Teh, Y. W.
@article{bloem-reddyProbabilisticSymmetryInvariant2019,
  archivePrefix = {arXiv},
  eprinttype = {arxiv},
  eprint = {1901.06082},
  primaryClass = {cs, stat},
  title = {Probabilistic Symmetry and Invariant Neural Networks},
  url = {http://arxiv.org/abs/1901.06082},
  abstract = {In an effort to improve the performance of deep neural networks in data-scarce, non-i.i.d., or unsupervised settings, much recent research has been devoted to encoding invariance under symmetry transformations into neural network architectures. We treat the neural network input and output as random variables, and consider group invariance from the perspective of probabilistic symmetry. Drawing on tools from probability and statistics, we establish a link between functional and probabilistic symmetry, and obtain generative functional representations of joint and conditional probability distributions that are invariant or equivariant under the action of a compact group. Those representations completely characterize the structure of neural networks that can be used to model such distributions and yield a general program for constructing invariant stochastic or deterministic neural networks. We develop the details of the general program for exchangeable sequences and arrays, recovering a number of recent examples as special cases.},
  urldate = {2019-01-25},
  date = {2019-01-17},
  keywords = {Statistics - Machine Learning, Computer Science - Machine Learning},
  author = {Bloem-Reddy, Benjamin and Teh, Yee Whye},
}