Pre-trained Contextualized Character Embeddings Lead to Major Improvements in Time Normalization: a Detailed Analysis. Xu, D., Laparra, E., & Bethard, S. In Proceedings of the Eighth Joint Conference on Lexical and Computational Semantics (*SEM 2019), pages 68–74, Minneapolis, Minnesota, June 2019. Association for Computational Linguistics.
@InProceedings{xu-laparra-bethard:2019:S19-1,
  author    = {Xu, Dongfang  and  Laparra, Egoitz  and  Bethard, Steven},
  title     = {Pre-trained Contextualized Character Embeddings Lead to Major Improvements in Time Normalization: a Detailed Analysis},
  booktitle = {Proceedings of the Eighth Joint Conference on Lexical and Computational Semantics (*SEM 2019)},
  month     = {June},
  year      = {2019},
  address   = {Minneapolis, Minnesota},
  publisher = {Association for Computational Linguistics},
  pages     = {68--74},
  url       = {http://www.aclweb.org/anthology/S19-1008},
  keywords = {timelines, information extraction},
}