Generalized kernel framework for unsupervised spectral methods of dimensionality reduction. Peluffo-Ordonez, D. H., Lee, J. A., & Verleysen, M. In 2014 IEEE Symposium on Computational Intelligence and Data Mining (CIDM), pages 171-177, December 2014. IEEE.
This work introduces a generalized kernel perspective for spectral dimensionality reduction approaches. First, an elegant matrix view of kernel principal component analysis (PCA) is described, and the relationship between kernel PCA and conventional PCA is shown using a parametric distance. Second, we introduce a weighted kernel PCA framework derived from least-squares support vector machines (LS-SVM). This approach starts from a latent variable that allows a relaxed LS-SVM problem to be written; such a problem is addressed by a primal-dual formulation. As a result, we provide kernel alternatives to spectral methods for dimensionality reduction such as multidimensional scaling, locally linear embedding, and Laplacian eigenmaps, as well as a versatile framework for explaining weighted PCA approaches. Experimentally, we show that incorporating an SVM model improves the performance of kernel PCA.
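The abstract's core building block, kernel PCA, can be illustrated with a minimal sketch: build a kernel matrix, double-center it, and project onto its leading eigenvectors. This is a generic textbook version for context, not the paper's weighted or LS-SVM formulation; the RBF kernel and `gamma` parameter are assumptions for the example.

```python
# Minimal kernel PCA sketch (illustrative only; NOT the paper's
# weighted/LS-SVM variant). RBF kernel and gamma are assumed here.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared Euclidean distances -> Gaussian kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Double-center the kernel matrix: Kc = J K J with J = I - (1/n) 11^T.
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Eigendecomposition of the symmetric centered kernel matrix.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Embedding: eigenvectors scaled by the square root of eigenvalues.
    return vecs * np.sqrt(np.maximum(vals, 0.0))

X = np.random.default_rng(0).normal(size=(50, 5))
Y = kernel_pca(X, n_components=2)
print(Y.shape)  # (50, 2)
```

With a linear kernel, this reduces to conventional PCA on centered data, which is the relationship the paper generalizes via a parametric distance.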
@inproceedings{Peluffo-Ordonez2014,
 title = {Generalized kernel framework for unsupervised spectral methods of dimensionality reduction},
 type = {inproceedings},
 year = {2014},
 pages = {171-177},
 websites = {http://ieeexplore.ieee.org/document/7008664/},
 month = {12},
 publisher = {IEEE},
 id = {70a2b317-3ab0-3920-9fea-104a43e03602},
 created = {2020-12-29T22:52:12.933Z},
 file_attached = {false},
 profile_id = {aba9653c-d139-3f95-aad8-969c487ed2f3},
 last_modified = {2021-02-20T22:05:33.580Z},
 read = {false},
 starred = {false},
 authored = {true},
 confirmed = {true},
 hidden = {false},
 citation_key = {Peluffo-Ordonez2014},
 private_publication = {false},
 abstract = {This work introduces a generalized kernel perspective for spectral dimensionality reduction approaches. First, an elegant matrix view of kernel principal component analysis (PCA) is described, and the relationship between kernel PCA and conventional PCA is shown using a parametric distance. Second, we introduce a weighted kernel PCA framework derived from least-squares support vector machines (LS-SVM). This approach starts from a latent variable that allows a relaxed LS-SVM problem to be written; such a problem is addressed by a primal-dual formulation. As a result, we provide kernel alternatives to spectral methods for dimensionality reduction such as multidimensional scaling, locally linear embedding, and Laplacian eigenmaps, as well as a versatile framework for explaining weighted PCA approaches. Experimentally, we show that incorporating an SVM model improves the performance of kernel PCA.},
 bibtype = {inproceedings},
 author = {Peluffo-Ordonez, Diego H. and Lee, John Aldo and Verleysen, Michel},
 doi = {10.1109/CIDM.2014.7008664},
 booktitle = {2014 IEEE Symposium on Computational Intelligence and Data Mining (CIDM)}
}