EntityBERT: Entity-centric Masking Strategy for Model Pretraining for the Clinical Domain. Lin, C., Miller, T., Dligach, D., Bethard, S., & Savova, G. In Proceedings of the 20th Workshop on Biomedical Language Processing, pages 191–201, Online, June 2021. Association for Computational Linguistics.
@inproceedings{lin-etal-2021-entitybert,
    title = "{E}ntity{BERT}: Entity-centric Masking Strategy for Model Pretraining for the Clinical Domain",
    author = "Lin, Chen  and
      Miller, Timothy  and
      Dligach, Dmitriy  and
      Bethard, Steven  and
      Savova, Guergana",
    booktitle = "Proceedings of the 20th Workshop on Biomedical Language Processing",
    month = jun,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2021.bionlp-1.21",
    pages = "191--201",
    keywords = "timelines, information extraction, health applications, workshop paper",
}