Zhao, Y. & Bethard, S. How does BERT's attention change when you fine-tune? An analysis methodology and a case study in negation scope. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 4729–4747, Online, July 2020. Association for Computational Linguistics.
@inproceedings{zhao-bethard-2020-berts,
    title = "How does {BERT}{'}s attention change when you fine-tune? An analysis methodology and a case study in negation scope",
    author = "Zhao, Yiyun  and
      Bethard, Steven",
    booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
    month = jul,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.acl-main.429",
    pages = "4729--4747",
    keywords = "negation, machine learning",
}