Dyer, C., Kuncoro, A., Ballesteros, M., & Smith, N. A. (2016). Recurrent Neural Network Grammars.
We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously published supervised generative model and better language modeling than state-of-the-art sequential RNNs in English and Chinese.
@article{dyerRecurrentNeuralNetwork2016,
  archivePrefix = {arXiv},
  eprinttype = {arxiv},
  eprint = {1602.07776},
  primaryClass = {cs},
  title = {Recurrent Neural Network Grammars},
  url = {http://arxiv.org/abs/1602.07776},
  abstract = {We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously published supervised generative model and better language modeling than state-of-the-art sequential RNNs in English and Chinese.},
  urldate = {2019-06-03},
  date = {2016-02-24},
  keywords = {Computer Science - Computation and Language,Computer Science - Neural and Evolutionary Computing},
  author = {Dyer, Chris and Kuncoro, Adhiguna and Ballesteros, Miguel and Smith, Noah A.},
}