On quantum backpropagation, information reuse, and cheating measurement collapse. Abbas, A., King, R., Huang, H., Huggins, W. J., Movassagh, R., Gilboa, D., & McClean, J. R. May, 2023. arXiv:2305.13362 [quant-ph]
Abstract: The success of modern deep learning hinges on the ability to train neural networks at scale. Through clever reuse of intermediate information, backpropagation facilitates training through gradient computation at a total cost roughly proportional to running the function, rather than incurring an additional factor proportional to the number of parameters – which can now be in the trillions. Naively, one expects that quantum measurement collapse entirely rules out the reuse of quantum information as in backpropagation. But recent developments in shadow tomography, which assumes access to multiple copies of a quantum state, have challenged that notion. Here, we investigate whether parameterized quantum models can train as efficiently as classical neural networks. We show that achieving backpropagation scaling is impossible without access to multiple copies of a state. With this added ability, we introduce an algorithm with foundations in shadow tomography that matches backpropagation scaling in quantum resources while reducing classical auxiliary computational costs to open problems in shadow tomography. These results highlight the nuance of reusing quantum information for practical purposes and clarify the unique difficulties in training large quantum models, which could alter the course of quantum machine learning.
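A rough way to read the "backpropagation scaling" claim (an illustrative sketch, not taken from the paper): let f(θ) be a model with M trainable parameters and let T_f be the cost of one evaluation, i.e. one forward pass or one run of the parameterized circuit. Classical backpropagation reuses intermediate activations, so the full gradient costs

\[ \mathrm{Cost}\!\left(\nabla_\theta f\right) = O(c \cdot T_f), \qquad c = O(1), \]

independent of M up to a constant factor. A naive quantum gradient method such as the parameter-shift rule instead evaluates the circuit separately for each parameter,

\[ \mathrm{Cost}\!\left(\nabla_\theta f\right) = \Theta(M \cdot T_f), \]

and it is this extra factor of M that the paper asks whether access to multiple copies of the state, via shadow-tomography-style information reuse, can remove.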
@misc{abbas_quantum_2023,
title = {On quantum backpropagation, information reuse, and cheating measurement collapse},
url = {http://arxiv.org/abs/2305.13362},
abstract = {The success of modern deep learning hinges on the ability to train neural networks at scale. Through clever reuse of intermediate information, backpropagation facilitates training through gradient computation at a total cost roughly proportional to running the function, rather than incurring an additional factor proportional to the number of parameters – which can now be in the trillions. Naively, one expects that quantum measurement collapse entirely rules out the reuse of quantum information as in backpropagation. But recent developments in shadow tomography, which assumes access to multiple copies of a quantum state, have challenged that notion. Here, we investigate whether parameterized quantum models can train as efficiently as classical neural networks. We show that achieving backpropagation scaling is impossible without access to multiple copies of a state. With this added ability, we introduce an algorithm with foundations in shadow tomography that matches backpropagation scaling in quantum resources while reducing classical auxiliary computational costs to open problems in shadow tomography. These results highlight the nuance of reusing quantum information for practical purposes and clarify the unique difficulties in training large quantum models, which could alter the course of quantum machine learning.},
language = {en},
urldate = {2023-06-27},
publisher = {arXiv},
author = {Abbas, Amira and King, Robbie and Huang, Hsin-Yuan and Huggins, William J. and Movassagh, Ramis and Gilboa, Dar and McClean, Jarrod R.},
month = may,
year = {2023},
note = {arXiv:2305.13362 [quant-ph]},
keywords = {Quantum Physics, Computer Science - Machine Learning},
annote = {Comment: 29 pages, 2 figures},
}
{"_id":"n8hQga22MJahQNbj6","bibbaseid":"abbas-king-huang-huggins-movassagh-gilboa-mcclean-onquantumbackpropagationinformationreuseandcheatingmeasurementcollapse-2023","author_short":["Abbas, A.","King, R.","Huang, H.","Huggins, W. J.","Movassagh, R.","Gilboa, D.","McClean, J. R."],"bibdata":{"bibtype":"misc","type":"misc","title":"On quantum backpropagation, information reuse, and cheating measurement collapse","url":"http://arxiv.org/abs/2305.13362","abstract":"The success of modern deep learning hinges on the ability to train neural networks at scale. Through clever reuse of intermediate information, backpropagation facilitates training through gradient computation at a total cost roughly proportional to running the function, rather than incurring an additional factor proportional to the number of parameters – which can now be in the trillions. Naively, one expects that quantum measurement collapse entirely rules out the reuse of quantum information as in backpropagation. But recent developments in shadow tomography, which assumes access to multiple copies of a quantum state, have challenged that notion. Here, we investigate whether parameterized quantum models can train as efficiently as classical neural networks. We show that achieving backpropagation scaling is impossible without access to multiple copies of a state. With this added ability, we introduce an algorithm with foundations in shadow tomography that matches backpropagation scaling in quantum resources while reducing classical auxiliary computational costs to open problems in shadow tomography. These results highlight the nuance of reusing quantum information for practical purposes and clarify the unique difficulties in training large quantum models, which could alter the course of quantum machine learning.","language":"en","urldate":"2023-06-27","publisher":"arXiv","author":[{"propositions":[],"lastnames":["Abbas"],"firstnames":["Amira"],"suffixes":[]},{"propositions":[],"lastnames":["King"],"firstnames":["Robbie"],"suffixes":[]},{"propositions":[],"lastnames":["Huang"],"firstnames":["Hsin-Yuan"],"suffixes":[]},{"propositions":[],"lastnames":["Huggins"],"firstnames":["William","J."],"suffixes":[]},{"propositions":[],"lastnames":["Movassagh"],"firstnames":["Ramis"],"suffixes":[]},{"propositions":[],"lastnames":["Gilboa"],"firstnames":["Dar"],"suffixes":[]},{"propositions":[],"lastnames":["McClean"],"firstnames":["Jarrod","R."],"suffixes":[]}],"month":"May","year":"2023","note":"arXiv:2305.13362 [quant-ph]","keywords":"Quantum Physics, Computer Science - Machine Learning","annote":"Comment: 29 pages, 2 figures","file":"Abbas et al. - 2023 - On quantum backpropagation, information reuse, and.pdf:/Users/georgehuang/Zotero/storage/LWJ8YDD4/Abbas et al. - 2023 - On quantum backpropagation, information reuse, and.pdf:application/pdf","bibtex":"@misc{abbas_quantum_2023,\n\ttitle = {On quantum backpropagation, information reuse, and cheating measurement collapse},\n\turl = {http://arxiv.org/abs/2305.13362},\n\tabstract = {The success of modern deep learning hinges on the ability to train neural networks at scale. Through clever reuse of intermediate information, backpropagation facilitates training through gradient computation at a total cost roughly proportional to running the function, rather than incurring an additional factor proportional to the number of parameters – which can now be in the trillions. Naively, one expects that quantum measurement collapse entirely rules out the reuse of quantum information as in backpropagation. 
But recent developments in shadow tomography, which assumes access to multiple copies of a quantum state, have challenged that notion. Here, we investigate whether parameterized quantum models can train as efficiently as classical neural networks. We show that achieving backpropagation scaling is impossible without access to multiple copies of a state. With this added ability, we introduce an algorithm with foundations in shadow tomography that matches backpropagation scaling in quantum resources while reducing classical auxiliary computational costs to open problems in shadow tomography. These results highlight the nuance of reusing quantum information for practical purposes and clarify the unique difficulties in training large quantum models, which could alter the course of quantum machine learning.},\n\tlanguage = {en},\n\turldate = {2023-06-27},\n\tpublisher = {arXiv},\n\tauthor = {Abbas, Amira and King, Robbie and Huang, Hsin-Yuan and Huggins, William J. and Movassagh, Ramis and Gilboa, Dar and McClean, Jarrod R.},\n\tmonth = may,\n\tyear = {2023},\n\tnote = {arXiv:2305.13362 [quant-ph]},\n\tkeywords = {Quantum Physics, Computer Science - Machine Learning},\n\tannote = {Comment: 29 pages, 2 figures},\n\tfile = {Abbas et al. - 2023 - On quantum backpropagation, information reuse, and.pdf:/Users/georgehuang/Zotero/storage/LWJ8YDD4/Abbas et al. - 2023 - On quantum backpropagation, information reuse, and.pdf:application/pdf},\n}\n\n","author_short":["Abbas, A.","King, R.","Huang, H.","Huggins, W. J.","Movassagh, R.","Gilboa, D.","McClean, J. R."],"key":"abbas_quantum_2023","id":"abbas_quantum_2023","bibbaseid":"abbas-king-huang-huggins-movassagh-gilboa-mcclean-onquantumbackpropagationinformationreuseandcheatingmeasurementcollapse-2023","role":"author","urls":{"Paper":"http://arxiv.org/abs/2305.13362"},"keyword":["Quantum Physics","Computer Science - Machine Learning"],"metadata":{"authorlinks":{}},"html":""},"bibtype":"misc","biburl":"https://bibbase.org/network/files/MdCywnfEcRNyDtvne","dataSources":["yoC7aEqiuoMyGb9he"],"keywords":["quantum physics","computer science - machine learning"],"search_terms":["quantum","backpropagation","information","reuse","cheating","measurement","collapse","abbas","king","huang","huggins","movassagh","gilboa","mcclean"],"title":"On quantum backpropagation, information reuse, and cheating measurement collapse","year":2023}