{"_id":"LS6fdMvrYNrMBksfB","bibbaseid":"wolf-debut-sanh-chaumond-delangue-moi-cistac-rault-etal-huggingfacestransformersstateoftheartnaturallanguageprocessing-2019","authorIDs":[],"author_short":["Wolf, T.","Debut, L.","Sanh, V.","Chaumond, J.","Delangue, C.","Moi, A.","Cistac, P.","Rault, T.","Louf, R.","Funtowicz, M.","Brew, J."],"bibdata":{"bibtype":"article","type":"article","abstract":"Recent advances in modern Natural Language Processing (NLP) research have been dominated by the combination of Transfer Learning methods with large-scale language models, in particular based on the Transformer architecture. With them came a paradigm shift in NLP with the starting point for training a model on a downstream task moving from a blank specific model to a general-purpose pretrained architecture. Still, creating these general-purpose models remains an expensive and time-consuming process restricting the use of these methods to a small sub-set of the wider NLP community. In this paper, we present HuggingFace's Transformers library, a library for state-of-the-art NLP, making these developments available to the community by gathering state-of-the-art general-purpose pretrained models under a unified API together with an ecosystem of libraries, examples, tutorials and scripts targeting many downstream NLP tasks. HuggingFace's Transformers library features carefully crafted model implementations and high-performance pretrained weights for two main deep learning frameworks, PyTorch and TensorFlow, while supporting all the necessary tools to analyze, evaluate and use these models in downstream tasks such as text/token classification, questions answering and language generation among others. The library has gained significant organic traction and adoption among both the researcher and practitioner communities. We are committed at HuggingFace to pursue the efforts to develop this toolkit with the ambition of creating the standard library for building NLP systems.","archiveprefix":"arXiv","arxivid":"1910.03771","author":[{"propositions":[],"lastnames":["Wolf"],"firstnames":["Thomas"],"suffixes":[]},{"propositions":[],"lastnames":["Debut"],"firstnames":["Lysandre"],"suffixes":[]},{"propositions":[],"lastnames":["Sanh"],"firstnames":["Victor"],"suffixes":[]},{"propositions":[],"lastnames":["Chaumond"],"firstnames":["Julien"],"suffixes":[]},{"propositions":[],"lastnames":["Delangue"],"firstnames":["Clement"],"suffixes":[]},{"propositions":[],"lastnames":["Moi"],"firstnames":["Anthony"],"suffixes":[]},{"propositions":[],"lastnames":["Cistac"],"firstnames":["Pierric"],"suffixes":[]},{"propositions":[],"lastnames":["Rault"],"firstnames":["Tim"],"suffixes":[]},{"propositions":[],"lastnames":["Louf"],"firstnames":["Rémi"],"suffixes":[]},{"propositions":[],"lastnames":["Funtowicz"],"firstnames":["Morgan"],"suffixes":[]},{"propositions":[],"lastnames":["Brew"],"firstnames":["Jamie"],"suffixes":[]}],"eprint":"1910.03771","file":":Users/shanest/Documents/Library/Wolf et al/Unknown/Wolf et al. 
- 2019 - HuggingFace's Transformers State-of-the-art Natural Language Processing.pdf:pdf","keywords":"dataset,model","month":"oct","title":"HuggingFace's Transformers: State-of-the-art Natural Language Processing","url":"http://arxiv.org/abs/1910.03771","year":"2019","bibtex":"@article{Wolf2019,\nabstract = {Recent advances in modern Natural Language Processing (NLP) research have been dominated by the combination of Transfer Learning methods with large-scale language models, in particular based on the Transformer architecture. With them came a paradigm shift in NLP with the starting point for training a model on a downstream task moving from a blank specific model to a general-purpose pretrained architecture. Still, creating these general-purpose models remains an expensive and time-consuming process restricting the use of these methods to a small sub-set of the wider NLP community. In this paper, we present HuggingFace's Transformers library, a library for state-of-the-art NLP, making these developments available to the community by gathering state-of-the-art general-purpose pretrained models under a unified API together with an ecosystem of libraries, examples, tutorials and scripts targeting many downstream NLP tasks. HuggingFace's Transformers library features carefully crafted model implementations and high-performance pretrained weights for two main deep learning frameworks, PyTorch and TensorFlow, while supporting all the necessary tools to analyze, evaluate and use these models in downstream tasks such as text/token classification, questions answering and language generation among others. The library has gained significant organic traction and adoption among both the researcher and practitioner communities. We are committed at HuggingFace to pursue the efforts to develop this toolkit with the ambition of creating the standard library for building NLP systems.},\narchivePrefix = {arXiv},\narxivId = {1910.03771},\nauthor = {Wolf, Thomas and Debut, Lysandre and Sanh, Victor and Chaumond, Julien and Delangue, Clement and Moi, Anthony and Cistac, Pierric and Rault, Tim and Louf, R{\\'{e}}mi and Funtowicz, Morgan and Brew, Jamie},\neprint = {1910.03771},\nfile = {:Users/shanest/Documents/Library/Wolf et al/Unknown/Wolf et al. - 2019 - HuggingFace's Transformers State-of-the-art Natural Language Processing.pdf:pdf},\nkeywords = {dataset,model},\nmonth = {oct},\ntitle = {{HuggingFace's Transformers: State-of-the-art Natural Language Processing}},\nurl = {http://arxiv.org/abs/1910.03771},\nyear = {2019}\n}\n","author_short":["Wolf, T.","Debut, L.","Sanh, V.","Chaumond, J.","Delangue, C.","Moi, A.","Cistac, P.","Rault, T.","Louf, R.","Funtowicz, M.","Brew, J."],"key":"Wolf2019","id":"Wolf2019","bibbaseid":"wolf-debut-sanh-chaumond-delangue-moi-cistac-rault-etal-huggingfacestransformersstateoftheartnaturallanguageprocessing-2019","role":"author","urls":{"Paper":"http://arxiv.org/abs/1910.03771"},"keyword":["dataset","model"],"metadata":{"authorlinks":{}},"downloads":0},"bibtype":"article","biburl":"https://www.shane.st/teaching/575/win20/MachineLearning-interpretability.bib","creationDate":"2020-01-06T20:30:40.658Z","downloads":0,"keywords":["dataset","model"],"search_terms":["huggingface","transformers","state","art","natural","language","processing","wolf","debut","sanh","chaumond","delangue","moi","cistac","rault","louf","funtowicz","brew"],"title":"HuggingFace's Transformers: State-of-the-art Natural Language Processing","year":2019,"dataSources":["okYcdTpf4JJ2zkj7A","znj7izS5PeehdLR3G"]}
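
% The abstract's central claim is a unified API over many pretrained
% architectures. A minimal sketch of that usage pattern follows, assuming a
% recent release of the `transformers` package with PyTorch installed; the
% checkpoint name `bert-base-uncased` and the classification task are
% illustrative choices, not details fixed by this entry.

# Minimal sketch (assumptions: `pip install transformers torch`; the
# checkpoint `bert-base-uncased` is an illustrative choice).
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The Auto* classes resolve a checkpoint name to the matching architecture,
# so the same two calls cover BERT, GPT-2, RoBERTa, and other model families.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# The classification head is randomly initialized here; it would normally be
# fine-tuned on a downstream dataset before use.
inputs = tokenizer("Transformers gathers pretrained models under one API.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2]) for the default 2-label head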