Improving Language Understanding with Unsupervised Learning. Abstract: We've obtained state-of-the-art results on a suite of diverse language tasks with a scalable, task-agnostic system, which we're also releasing. Our approach is a combination of two existing ideas: transformers and unsupervised pre-training. These results provide a convincing example that pairing supervised learning methods with unsupervised pre-training works very well;
@online{ImprovingLanguageUnderstanding2018,
title = {Improving {{Language Understanding}} with {{Unsupervised Learning}}},
url = {https://blog.openai.com/language-unsupervised/},
abstract = {We've obtained state-of-the-art results on a suite of diverse language tasks with a scalable, task-agnostic system, which we're also releasing. Our approach is a combination of two existing ideas: transformers and unsupervised pre-training. These results provide a convincing example that pairing supervised learning methods with unsupervised pre-training works very well;},
journaltitle = {OpenAI Blog},
urldate = {2018-11-01},
date = {2018-06-11T18:11:50.000Z},
file = {/home/dimitri/Nextcloud/Zotero/storage/5Q7P832V/2018 - Improving Language Understanding with Unsupervised.pdf;/home/dimitri/Nextcloud/Zotero/storage/LSPHJAFJ/language-unsupervised.html}
}
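The abstract's recipe, unsupervised pre-training on raw text followed by supervised fine-tuning on a labeled task, can be sketched in miniature. In this hedged illustration a bigram language model stands in for the transformer, and every function name, sentence, and label below is invented for the example, not taken from the referenced work.

```python
from collections import Counter, defaultdict

# Toy two-stage pipeline: (1) unsupervised pre-training on unlabeled text,
# (2) a minimal supervised fine-tuning step on a small labeled set.
# A bigram model stands in for the transformer; all data is illustrative.

def pretrain(corpus):
    """Stage 1: learn next-token counts from raw, unlabeled text."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.lower().split()
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return counts

def avg_transition_prob(sentence, counts):
    """Feature from the pre-trained model: mean bigram probability."""
    tokens = sentence.lower().split()
    pairs = list(zip(tokens, tokens[1:]))
    if not pairs:
        return 0.0
    probs = []
    for prev, nxt in pairs:
        total = sum(counts[prev].values())
        probs.append(counts[prev][nxt] / total if total else 0.0)
    return sum(probs) / len(pairs)

def finetune(labeled, counts):
    """Stage 2: fit a decision threshold from labeled examples,
    using the pre-trained model's output as the only feature."""
    pos = [avg_transition_prob(s, counts) for s, y in labeled if y == 1]
    neg = [avg_transition_prob(s, counts) for s, y in labeled if y == 0]
    # Midpoint between the class means: the smallest supervised step.
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

# Unlabeled corpus for pre-training (hypothetical data).
unlabeled = [
    "the film was good",
    "the film was good indeed",
    "a strange film",
]
counts = pretrain(unlabeled)

# Small labeled set: 1 = fluent word order, 0 = scrambled (hypothetical).
labeled = [("the film was good", 1), ("good was film the", 0)]
threshold = finetune(labeled, counts)
```

The point of the sketch is the division of labor: almost all the signal comes from the unsupervised stage, and the supervised stage only adapts its output to the task, which is the pairing the abstract calls out.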