Chandra, A., Tünnermann, L., Löfstedt, T., & Gratz, R. (2023). Transformer-based deep learning for predicting protein properties in the life sciences. eLife, 12:e82819. doi: 10.7554/eLife.82819

Abstract: Recent developments in deep learning, coupled with an increasing number of sequenced proteins, have led to a breakthrough in life science applications, in particular in protein property prediction. There is hope that deep learning can close the gap between the number of sequenced proteins and proteins with known properties based on lab experiments. Language models from the field of natural language processing have gained popularity for protein property predictions and have led to a new computational revolution in biology, where old prediction results are being improved regularly. Such models can learn useful multipurpose representations of proteins from large open repositories of protein sequences and can be used, for instance, to predict protein properties. The field of natural language processing is growing quickly because of developments in a class of models based on a particular model—the Transformer model. We review recent developments and the use of large-scale Transformer models in applications for predicting protein characteristics and how such models can be used to predict, for example, post-translational modifications. We review shortcomings of other deep learning models and explain how the Transformer models have quickly proven to be a very promising way to unravel information hidden in the sequences of amino acids.
@article{chandra_transformer-based_2023,
title = {Transformer-based deep learning for predicting protein properties in the life sciences},
volume = {12},
issn = {2050-084X},
url = {https://doi.org/10.7554/eLife.82819},
doi = {10.7554/eLife.82819},
abstract = {Recent developments in deep learning, coupled with an increasing number of sequenced proteins, have led to a breakthrough in life science applications, in particular in protein property prediction. There is hope that deep learning can close the gap between the number of sequenced proteins and proteins with known properties based on lab experiments. Language models from the field of natural language processing have gained popularity for protein property predictions and have led to a new computational revolution in biology, where old prediction results are being improved regularly. Such models can learn useful multipurpose representations of proteins from large open repositories of protein sequences and can be used, for instance, to predict protein properties. The field of natural language processing is growing quickly because of developments in a class of models based on a particular model—the Transformer model. We review recent developments and the use of large-scale Transformer models in applications for predicting protein characteristics and how such models can be used to predict, for example, post-translational modifications. We review shortcomings of other deep learning models and explain how the Transformer models have quickly proven to be a very promising way to unravel information hidden in the sequences of amino acids.},
urldate = {2023-01-20},
journal = {eLife},
author = {Chandra, Abel and Tünnermann, Laura and Löfstedt, Tommy and Gratz, Regina},
editor = {Dötsch, Volker},
month = jan,
year = {2023},
keywords = {deep learning, life sciences, machine learning, protein property prediction, transformers},
pages = {e82819},
}