Generalisation Bounds for Kernel PCA through PAC-Bayes Learning. Haddouche, M., Guedj, B., & Shawe-Taylor, J. Stat, 2024. Paper: https://arxiv.org/abs/2012.10369. PDF: https://arxiv.org/pdf/2012.10369.pdf

Abstract: Principal Component Analysis (PCA) is a popular method for dimension reduction and has attracted an unfailing interest for decades. Recently, kernel PCA has emerged as an extension of PCA but, despite its use in practice, a sound theoretical understanding of kernel PCA is missing. In this paper, we contribute lower and upper bounds on the efficiency of kernel PCA, involving the empirical eigenvalues of the kernel Gram matrix. Two bounds are for fixed estimators, and two are for randomized estimators through the PAC-Bayes theory. We control how much information is captured by kernel PCA on average, and we dissect the bounds to highlight strengths and limitations of the kernel PCA algorithm. Therefore, we contribute to the better understanding of kernel PCA. Our bounds are briefly illustrated on a toy numerical example.
@article{haddouche2020upper,
  title = {Generalisation Bounds for {Kernel PCA} through {PAC-Bayes} Learning},
  author = {Maxime Haddouche and Benjamin Guedj and John Shawe-Taylor},
  year = {2024},
  journal = {Stat},
  abstract = {Principal Component Analysis (PCA) is a popular method for dimension reduction and has attracted an unfailing interest for decades. Recently, kernel PCA has emerged as an extension of PCA but, despite its use in practice, a sound theoretical understanding of kernel PCA is missing. In this paper, we contribute lower and upper bounds on the efficiency of kernel PCA, involving the empirical eigenvalues of the kernel Gram matrix. Two bounds are for fixed estimators, and two are for randomized estimators through the PAC-Bayes theory. We control how much information is captured by kernel PCA on average, and we dissect the bounds to highlight strengths and limitations of the kernel PCA algorithm. Therefore, we contribute to the better understanding of kernel PCA. Our bounds are briefly illustrated on a toy numerical example.},
  url = {https://arxiv.org/abs/2012.10369},
  url_PDF = {https://arxiv.org/pdf/2012.10369.pdf},
  eprint = {2012.10369},
  archivePrefix = {arXiv},
  primaryClass = {cs.LG},
  keywords = {mine}
}
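As a companion illustration (not code from the paper): the bounds above involve the empirical eigenvalues of the kernel Gram matrix, which measure how much information kernel PCA captures on average. Below is a minimal NumPy sketch of that empirical quantity, assuming an RBF kernel; the helper names (rbf_gram, kernel_pca_captured_variance) and parameters (gamma, k) are hypothetical choices for this example.

import numpy as np

def rbf_gram(X, gamma=1.0):
    # RBF kernel k(x, x') = exp(-gamma * ||x - x'||^2), computed pairwise.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def kernel_pca_captured_variance(X, k=2, gamma=1.0):
    # Fraction of empirical variance captured by the top-k kernel PCA
    # components: the sum of the k largest eigenvalues of the centred
    # Gram matrix, divided by its trace.
    n = X.shape[0]
    K = rbf_gram(X, gamma)
    # Centre in feature space: K_c = H K H with H = I - (1/n) * 1 1^T.
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Empirical eigenvalues, sorted in descending order; clip tiny
    # negative values arising from floating-point error.
    eigvals = np.clip(np.linalg.eigvalsh(Kc)[::-1], 0.0, None)
    return eigvals[:k].sum() / eigvals.sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
print(kernel_pca_captured_variance(X, k=2))

The sketch only computes the empirical side; the paper's fixed-estimator and PAC-Bayes bounds relate this quantity to its population counterpart.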
{"_id":"47kiubzm6s7y94oub","bibbaseid":"haddouche-guedj-shawetaylor-generalisationboundsforkernelpcathroughpacbayeslearning-2024","author_short":["Haddouche, M.","Guedj, B.","Shawe-Taylor, J."],"bibdata":{"bibtype":"article","type":"article","title":"Generalisation Bounds for Kernel PCA through PAC-Bayes Learning","author":[{"firstnames":["Maxime"],"propositions":[],"lastnames":["Haddouche"],"suffixes":[]},{"firstnames":["Benjamin"],"propositions":[],"lastnames":["Guedj"],"suffixes":[]},{"firstnames":["John"],"propositions":[],"lastnames":["Shawe-Taylor"],"suffixes":[]}],"year":"2024","journal":"Stat","abstract":"Principal Component Analysis (PCA) is a popular method for dimension reduction and has attracted an unfailing interest for decades. Recently, kernel PCA has emerged as an extension of PCA but, despite its use in practice, a sound theoretical understanding of kernel PCA is missing. In this paper, we contribute lower and upper bounds on the efficiency of kernel PCA, involving the empirical eigenvalues of the kernel Gram matrix. Two bounds are for fixed estimators, and two are for randomized estimators through the PAC-Bayes theory. We control how much information is captured by kernel PCA on average, and we dissect the bounds to highlight strengths and limitations of the kernel PCA algorithm. Therefore, we contribute to the better understanding of kernel PCA. Our bounds are briefly illustrated on a toy numerical example.","url":"https://arxiv.org/abs/2012.10369","url_pdf":"https://arxiv.org/pdf/2012.10369.pdf","eprint":"2012.10369","archiveprefix":"arXiv","primaryclass":"cs.LG","keywords":"mine","bibtex":"@article{haddouche2020upper,\ntitle={Generalisation Bounds for {Kernel PCA} through {PAC-Bayes} Learning}, \nauthor={Maxime Haddouche and Benjamin Guedj and John Shawe-Taylor},\nyear={2024},\njournal ={Stat},\nabstract = {Principal Component Analysis (PCA) is a popular method for dimension reduction and has attracted an unfailing interest for decades. Recently, kernel PCA has emerged as an extension of PCA but, despite its use in practice, a sound theoretical understanding of kernel PCA is missing. In this paper, we contribute lower and upper bounds on the efficiency of kernel PCA, involving the empirical eigenvalues of the kernel Gram matrix. Two bounds are for fixed estimators, and two are for randomized estimators through the PAC-Bayes theory. We control how much information is captured by kernel PCA on average, and we dissect the bounds to highlight strengths and limitations of the kernel PCA algorithm. Therefore, we contribute to the better understanding of kernel PCA. 
Our bounds are briefly illustrated on a toy numerical example.},\nurl = {https://arxiv.org/abs/2012.10369},\nurl_PDF = {https://arxiv.org/pdf/2012.10369.pdf},\neprint={2012.10369},\narchivePrefix={arXiv},\nprimaryClass={cs.LG},\nkeywords={mine}\n}\n\n","author_short":["Haddouche, M.","Guedj, B.","Shawe-Taylor, J."],"key":"haddouche2020upper","id":"haddouche2020upper","bibbaseid":"haddouche-guedj-shawetaylor-generalisationboundsforkernelpcathroughpacbayeslearning-2024","role":"author","urls":{"Paper":"https://arxiv.org/abs/2012.10369"," pdf":"https://arxiv.org/pdf/2012.10369.pdf"},"keyword":["mine"],"metadata":{"authorlinks":{}},"downloads":5,"html":""},"bibtype":"article","biburl":"https://bguedj.github.io/files/bguedj-publications.bib","dataSources":["suE7RgYeZEnSYr5Fy"],"keywords":["mine"],"search_terms":["generalisation","bounds","kernel","pca","through","pac","bayes","learning","haddouche","guedj","shawe-taylor"],"title":"Generalisation Bounds for Kernel PCA through PAC-Bayes Learning","year":2024,"downloads":5}