Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations. Araujo, V., Villa, A., Mendoza, M., Moens, M., & Soto, A. In EMNLP, 2021.
Current language models are usually trained using a self-supervised scheme, where the main focus is learning representations at the word or sentence level. However, there has been limited progress in generating useful discourse-level representations. In this work, we propose to use ideas from predictive coding theory to augment BERT-style language models with a mechanism that allows them to learn suitable discourse-level representations. As a result, our proposed approach is able to predict future sentences using explicit top-down connections that operate at the intermediate layers of the network. By experimenting with benchmarks designed to evaluate discourse-related knowledge using pre-trained sentence representations, we demonstrate that our approach improves performance in 6 out of 11 tasks by excelling in discourse relationship detection.
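For readers curious how such a mechanism might be wired up in practice, the sketch below shows one way to attach a predictive-coding auxiliary objective to a pre-trained BERT: a small top-down head takes a sentence representation from an intermediate layer and is trained to predict the representation of the following sentence with a contrastive loss. The class name, the InfoNCE-style objective, and the choice of layer are illustrative assumptions, not the paper's actual implementation (see the linked PDF for the real architecture).

# Hypothetical sketch: predictive-coding auxiliary head on top of BERT.
# Names, loss, and layer index are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel, BertTokenizer

class PredictiveCodingHead(nn.Module):
    """Top-down projection that predicts the next sentence's representation
    from the current sentence's intermediate-layer representation."""
    def __init__(self, hidden_size: int = 768):
        super().__init__()
        self.top_down = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.GELU(),
            nn.Linear(hidden_size, hidden_size),
        )

    def forward(self, sent_repr: torch.Tensor) -> torch.Tensor:
        return self.top_down(sent_repr)

def info_nce_loss(pred: torch.Tensor, target: torch.Tensor, temperature: float = 0.1):
    """Contrastive loss: each prediction should match its own next-sentence
    representation, with other items in the batch serving as negatives."""
    pred = F.normalize(pred, dim=-1)
    target = F.normalize(target, dim=-1)
    logits = pred @ target.t() / temperature
    labels = torch.arange(pred.size(0), device=pred.device)
    return F.cross_entropy(logits, labels)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
pc_head = PredictiveCodingHead(bert.config.hidden_size)

# Toy document: consecutive sentence pairs (current sentence -> next sentence).
current = ["The weather was terrible.", "She opened the old letter."]
following = ["So the match was cancelled.", "It was from her grandmother."]

enc_cur = tokenizer(current, return_tensors="pt", padding=True)
enc_nxt = tokenizer(following, return_tensors="pt", padding=True)

with torch.no_grad():
    out_cur = bert(**enc_cur, output_hidden_states=True)
    out_nxt = bert(**enc_nxt, output_hidden_states=True)

# Use the [CLS] vector of an intermediate layer (layer 8 here, chosen arbitrarily)
# as the sentence representation fed to the top-down predictor.
layer = 8
repr_cur = out_cur.hidden_states[layer][:, 0]
repr_nxt = out_nxt.hidden_states[layer][:, 0]

loss = info_nce_loss(pc_head(repr_cur), repr_nxt)
print(f"auxiliary predictive-coding loss: {loss.item():.4f}")

In an actual training setup this auxiliary loss would be added to the masked-language-modeling objective and backpropagated through BERT as well; the snippet freezes BERT only to keep the example small.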
@inproceedings{VladiEtAl:EMNLP:2021,
author = {V. Araujo and A. Villa and M. Mendoza and M. Moens and A. Soto},
title = {Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations},
booktitle = {{EMNLP}},
year = {2021},
 abstract = {Current language models are usually trained using a self-supervised scheme, where the main focus is learning representations at the word or sentence level. However, there has been limited progress in generating useful discourse-level representations. In this work, we propose to use ideas from predictive coding theory to augment BERT-style language models with a mechanism that allows them to learn suitable discourse-level representations. As a result, our proposed approach is able to predict future sentences using explicit top-down connections that operate at the intermediate layers of the network. By experimenting with benchmarks designed to evaluate discourse-related knowledge using pre-trained sentence representations, we demonstrate that our approach improves performance in 6 out of 11 tasks by excelling in discourse relationship detection.},
url = {https://arxiv.org/pdf/2109.04602.pdf},
}
{"_id":"Be4X8Lp65mp8dEJCW","bibbaseid":"araujo-villa-mendoza-moens-soto-augmentingbertstylemodelswithpredictivecodingtoimprovediscourselevelrepresentations-2021","author_short":["Araujo, V.","Villa, A.","Mendoza, M.","Moens, M.","Soto, A."],"bibdata":{"bibtype":"inproceedings","type":"inproceedings","author":[{"firstnames":["V."],"propositions":[],"lastnames":["Araujo"],"suffixes":[]},{"firstnames":["A."],"propositions":[],"lastnames":["Villa"],"suffixes":[]},{"firstnames":["M."],"propositions":[],"lastnames":["Mendoza"],"suffixes":[]},{"firstnames":["M."],"propositions":[],"lastnames":["Moens"],"suffixes":[]},{"firstnames":["A."],"propositions":[],"lastnames":["Soto"],"suffixes":[]}],"title":"Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations","booktitle":"EMNLP","year":"2021","abstract":"Current language models are usually trained using a self-supervised scheme, where the main focus is learning representations at the word or sentence level. However, there has been limited progress in generating useful discourse-level representations. In this work, we propose to use ideas from predictive coding theory to augment BERT-style language models with a mechanism that allows them to learn suitable discourse-level representations. As a result, our proposed approach is able to predict future sentences using explicit top-down connections that operate at the intermediate layers of the network. By experimenting with benchmarks designed to evaluate discourse-related knowledge using pre-trained sentence representations, we demonstrate that our approach improves performance in 6 out of 11 tasks by excelling in discourse relationship detection.","url":"https://arxiv.org/pdf/2109.04602.pdf","bibtex":"@inproceedings{VladiEtAl:ACL:2021,\n author = {V. Araujo and A. Villa and M. Mendoza and M. Moens and A. Soto},\n title = {Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations},\n booktitle = {{EMNLP}},\n year = {2021},\n abstract = {Current language models are usually trained using a self-supervised scheme, where the\nmain focus is learning representations at the word or sentence level. However, there has\nbeen limited progress in generating useful discourse-level representations. In this work,\nwe propose to use ideas from predictive coding theory to augment BERT-style language models with a mechanism that allows them to learn suitable discourse-level representations. As a\nresult, our proposed approach is able to predict future sentences using explicit top-down connections that operate at the intermediate layers of the network. 
By experimenting with benchmarks designed to evaluate discourse-related knowledge using pre-trained sentence representations, we demonstrate that our approach improves performance in 6 out of 11 tasks by excelling in discourse relationship detection.},\nurl = {https://arxiv.org/pdf/2109.04602.pdf},\n}\n\n\n","author_short":["Araujo, V.","Villa, A.","Mendoza, M.","Moens, M.","Soto, A."],"key":"VladiEtAl:ACL:2021","id":"VladiEtAl:ACL:2021","bibbaseid":"araujo-villa-mendoza-moens-soto-augmentingbertstylemodelswithpredictivecodingtoimprovediscourselevelrepresentations-2021","role":"author","urls":{"Paper":"https://arxiv.org/pdf/2109.04602.pdf"},"metadata":{"authorlinks":{}},"downloads":1,"html":""},"bibtype":"inproceedings","biburl":"https://asoto.ing.puc.cl/AlvaroPapers.bib","dataSources":["QjT2DEZoWmQYxjHXS"],"keywords":[],"search_terms":["augmenting","bert","style","models","predictive","coding","improve","discourse","level","representations","araujo","villa","mendoza","moens","soto"],"title":"Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations","year":2021,"downloads":1}