


@misc{jerbi_shadows_2023,
  title     = {Shadows of quantum machine learning},
  author    = {Jerbi, Sofiene and Gyurik, Casper and Marshall, Simon C. and Molteni, Riccardo and Dunjko, Vedran},
  year      = {2023},
  month     = may,
  publisher = {arXiv},
  url       = {http://arxiv.org/abs/2306.00061},
  urldate   = {2023-07-07},
  language  = {en},
  note      = {arXiv:2306.00061 [quant-ph, stat]},
  annote    = {Comment: 7 + 16 pages, 5 figures},
  keywords  = {Quantum Physics, Computer Science - Machine Learning, Statistics - Machine Learning, Computer Science - Artificial Intelligence},
  abstract  = {Quantum machine learning is often highlighted as one of the most promising uses for a quantum computer to solve practical problems. However, a major obstacle to the widespread use of quantum machine learning models in practice is that these models, even once trained, still require access to a quantum computer in order to be evaluated on new data. To solve this issue, we suggest that following the training phase of a quantum model, a quantum computer could be used to generate what we call a classical shadow of this model, i.e., a classically computable approximation of the learned function. While recent works already explore this idea and suggest approaches to construct such shadow models, they also raise the possibility that a completely classical model could be trained instead, thus circumventing the need for a quantum computer in the first place. In this work, we take a novel approach to define shadow models based on the frameworks of quantum linear models and classical shadow tomography. This approach allows us to show that there exist shadow models which can solve certain learning tasks that are intractable for fully classical models, based on widely-believed cryptography assumptions. We also discuss the (un)likeliness that all quantum models could be shadowfiable, based on common assumptions in complexity theory.},
}
