Deep physical neural networks trained with backpropagation. Wright, L. G., Onodera, T., Stein, M. M., Wang, T., Schachter, D. T., Hu, Z., & McMahon, P. L. Nature, 601(7894):549–555, January 2022. 11 citations (Semantic Scholar/DOI) [2022-05-17]
Abstract: Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability [1]. Deep-learning accelerators [2–9] aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far [10–22] have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ–in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics [23–26], materials [27–29] and smart sensors [30–32].
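The core idea of physics-aware training lends itself to a short sketch: run the forward pass on the physical system itself (so its imperfections and noise are included), and backpropagate through a differentiable digital model of that system. Below is a minimal, hypothetical PyTorch sketch of this hybrid in situ–in silico loop; physical_system and digital_model are illustrative stand-ins, not the paper's actual apparatus (which used optical, mechanical and electronic systems).

import torch

# Hypothetical stand-ins: a noisy "hardware" transform and its idealized,
# differentiable digital twin.
def physical_system(x, p):
    # Imperfect, noisy forward computation (the "hardware").
    return torch.tanh(x @ p) + 0.01 * torch.randn(x.shape[0], p.shape[1])

def digital_model(x, p):
    # Differentiable approximation of the hardware, used only for gradients.
    return torch.tanh(x @ p)

class PhysicsAwareLayer(torch.autograd.Function):
    """Forward pass in situ (on the physical system); backward pass in
    silico (through the differentiable digital model)."""

    @staticmethod
    def forward(ctx, x, params):
        ctx.save_for_backward(x, params)
        # The real device produces the output, so its imperfections and
        # noise are automatically baked into training.
        return physical_system(x, params)

    @staticmethod
    def backward(ctx, grad_output):
        x, params = ctx.saved_tensors
        # Re-run the digital twin with autograd enabled to estimate the
        # gradients the non-differentiable hardware cannot provide.
        with torch.enable_grad():
            x_ = x.detach().requires_grad_()
            p_ = params.detach().requires_grad_()
            y = digital_model(x_, p_)
            grad_x, grad_p = torch.autograd.grad(
                y, (x_, p_), grad_outputs=grad_output)
        return grad_x, grad_p

# Toy training step with made-up shapes and data.
params = torch.randn(4, 3, requires_grad=True)
opt = torch.optim.SGD([params], lr=0.1)
x, target = torch.randn(8, 4), torch.randn(8, 3)
y = PhysicsAwareLayer.apply(x, params)      # forward: "hardware"
loss = torch.nn.functional.mse_loss(y, target)
loss.backward()                             # backward: digital twin
opt.step()

Deeper networks would simply stack such layers, each pairing a controllable physical transformation with its own digital model.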
@article{wright_deep_2022,
title = {Deep physical neural networks trained with backpropagation},
volume = {601},
issn = {0028-0836, 1476-4687},
url = {https://www.nature.com/articles/s41586-021-04223-6},
doi = {10.1038/s41586-021-04223-6},
	abstract = {Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability [1]. Deep-learning accelerators [2–9] aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far [10–22] have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ–in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics [23–26], materials [27–29] and smart sensors [30–32].},
language = {en},
number = {7894},
urldate = {2022-05-09},
journal = {Nature},
author = {Wright, Logan G. and Onodera, Tatsuhiro and Stein, Martin M. and Wang, Tianyu and Schachter, Darren T. and Hu, Zoey and McMahon, Peter L.},
month = jan,
year = {2022},
note = {11 citations (Semantic Scholar/DOI) [2022-05-17]},
pages = {549--555},
}
{"_id":"gcu47Wq5sdarMPx4n","bibbaseid":"wright-onodera-stein-wang-schachter-hu-mcmahon-deepphysicalneuralnetworkstrainedwithbackpropagation-2022","author_short":["Wright, L. G.","Onodera, T.","Stein, M. M.","Wang, T.","Schachter, D. T.","Hu, Z.","McMahon, P. L."],"bibdata":{"bibtype":"article","type":"article","title":"Deep physical neural networks trained with backpropagation","volume":"601","issn":"0028-0836, 1476-4687","url":"https://www.nature.com/articles/s41586-021-04223-6","doi":"10.1038/s41586-021-04223-6","abstract":"Abstract Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability 1 . Deep-learning accelerators 2–9 aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far 10–22 have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ–in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics 23–26 , materials 27–29 and smart sensors 30–32 .","language":"en","number":"7894","urldate":"2022-05-09","journal":"Nature","author":[{"propositions":[],"lastnames":["Wright"],"firstnames":["Logan","G."],"suffixes":[]},{"propositions":[],"lastnames":["Onodera"],"firstnames":["Tatsuhiro"],"suffixes":[]},{"propositions":[],"lastnames":["Stein"],"firstnames":["Martin","M."],"suffixes":[]},{"propositions":[],"lastnames":["Wang"],"firstnames":["Tianyu"],"suffixes":[]},{"propositions":[],"lastnames":["Schachter"],"firstnames":["Darren","T."],"suffixes":[]},{"propositions":[],"lastnames":["Hu"],"firstnames":["Zoey"],"suffixes":[]},{"propositions":[],"lastnames":["McMahon"],"firstnames":["Peter","L."],"suffixes":[]}],"month":"January","year":"2022","note":"11 citations (Semantic Scholar/DOI) [2022-05-17]","pages":"549–555","bibtex":"@article{wright_deep_2022,\n\ttitle = {Deep physical neural networks trained with backpropagation},\n\tvolume = {601},\n\tissn = {0028-0836, 1476-4687},\n\turl = {https://www.nature.com/articles/s41586-021-04223-6},\n\tdoi = {10.1038/s41586-021-04223-6},\n\tabstract = {Abstract\n \n Deep-learning models have become pervasive tools in science and engineering. 
However, their energy requirements now increasingly limit their scalability\n 1\n . Deep-learning accelerators\n 2–9\n aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far\n 10–22\n have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ–in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics\n 23–26\n , materials\n 27–29\n and smart sensors\n 30–32\n .},\n\tlanguage = {en},\n\tnumber = {7894},\n\turldate = {2022-05-09},\n\tjournal = {Nature},\n\tauthor = {Wright, Logan G. and Onodera, Tatsuhiro and Stein, Martin M. and Wang, Tianyu and Schachter, Darren T. and Hu, Zoey and McMahon, Peter L.},\n\tmonth = jan,\n\tyear = {2022},\n\tnote = {11 citations (Semantic Scholar/DOI) [2022-05-17]},\n\tpages = {549--555},\n}\n\n\n\n","author_short":["Wright, L. G.","Onodera, T.","Stein, M. M.","Wang, T.","Schachter, D. T.","Hu, Z.","McMahon, P. L."],"key":"wright_deep_2022","id":"wright_deep_2022","bibbaseid":"wright-onodera-stein-wang-schachter-hu-mcmahon-deepphysicalneuralnetworkstrainedwithbackpropagation-2022","role":"author","urls":{"Paper":"https://www.nature.com/articles/s41586-021-04223-6"},"metadata":{"authorlinks":{}},"downloads":0,"html":""},"bibtype":"article","biburl":"https://bibbase.org/zotero/qiuyuanwang","dataSources":["wWPhSRj9hrZuqsm9D"],"keywords":[],"search_terms":["deep","physical","neural","networks","trained","backpropagation","wright","onodera","stein","wang","schachter","hu","mcmahon"],"title":"Deep physical neural networks trained with backpropagation","year":2022}