Learning Deep Representations by Mutual Information Estimation and Maximization. Hjelm, R. D., Fedorov, A., Lavoie-Marchildon, S., Grewal, K., Bachman, P., Trischler, A., & Bengio, Y.
@article{hjelmLearningDeepRepresentations2018,
  archivePrefix = {arXiv},
  eprinttype = {arxiv},
  eprint = {1808.06670},
  primaryClass = {cs, stat},
  title = {Learning Deep Representations by Mutual Information Estimation and Maximization},
  url = {http://arxiv.org/abs/1808.06670},
  abstract = {In this work, we perform unsupervised learning of representations by maximizing mutual information between an input and the output of a deep neural network encoder. Importantly, we show that structure matters: incorporating knowledge about locality of the input to the objective can greatly influence a representation's suitability for downstream tasks. We further control characteristics of the representation by matching to a prior distribution adversarially. Our method, which we call Deep InfoMax (DIM), outperforms a number of popular unsupervised learning methods and competes with fully-supervised learning on several classification tasks. DIM opens new avenues for unsupervised learning of representations and is an important step towards flexible formulations of representation-learning objectives for specific end-goals.},
  urldate = {2019-04-01},
  date = {2018-08-20},
  keywords = {Statistics - Machine Learning, Computer Science - Machine Learning},
  author = {Hjelm, R. Devon and Fedorov, Alex and Lavoie-Marchildon, Samuel and Grewal, Karan and Bachman, Phil and Trischler, Adam and Bengio, Yoshua},
}
