Generalization in Deep Learning. Kawaguchi, K., Kaelbling, L. P., & Bengio, Y.
@article{kawaguchiGeneralizationDeepLearning2017,
  archivePrefix = {arXiv},
  eprinttype = {arxiv},
  eprint = {1710.05468},
  primaryClass = {cs, stat},
  title = {Generalization in {{Deep Learning}}},
  url = {http://arxiv.org/abs/1710.05468},
  abstract = {This paper provides non-vacuous and numerically-tight generalization guarantees for deep learning, as well as theoretical insights into why and how deep learning can generalize well, despite its large capacity, complexity, possible algorithmic instability, nonrobustness, and sharp minima, responding to an open question in the literature. We also propose new open problems and discuss the limitations of our results.},
  urldate = {2019-05-14},
  date = {2017-10-15},
  keywords = {Statistics - Machine Learning,Computer Science - Artificial Intelligence,Computer Science - Machine Learning,Computer Science - Neural and Evolutionary Computing},
  author = {Kawaguchi, Kenji and Kaelbling, Leslie Pack and Bengio, Yoshua},
}