Kernel Multivariate Analysis Framework for Supervised Subspace Learning: A Tutorial on Linear and Kernel Multivariate Methods. Arenas-Garcia, J., Petersen, K., Camps-Valls, G., & Hansen, L. K. IEEE Signal Processing Magazine, 30(4):16–29, July 2013.
@article{arenas-garcia_kernel_2013,
  title = {Kernel {Multivariate} {Analysis} {Framework} for {Supervised} {Subspace} {Learning}: {A} {Tutorial} on {Linear} and {Kernel} {Multivariate} {Methods}},
  volume = {30},
  issn = {1053-5888},
  shorttitle = {Kernel {Multivariate} {Analysis} {Framework} for {Supervised} {Subspace} {Learning}},
  doi = {10.1109/MSP.2013.2250591},
  abstract = {Feature extraction and dimensionality reduction are important tasks in many fields of science dealing with signal processing and analysis. The relevance of these techniques is increasing as current sensory devices are developed with ever higher resolution, and problems involving multimodal data sources become more common. A plethora of feature extraction methods are available in the literature collectively grouped under the field of multivariate analysis (MVA). This article provides a uniform treatment of several methods: principal component analysis (PCA), partial least squares (PLS), canonical correlation analysis (CCA), and orthonormalized PLS (OPLS), as well as their nonlinear extensions derived by means of the theory of reproducing kernel Hilbert spaces (RKHSs). We also review their connections to other methods for classification and statistical dependence estimation and introduce some recent developments to deal with the extreme cases of large-scale and low-sized problems. To illustrate the wide applicability of these methods in both classification and regression problems, we analyze their performance in a benchmark of publicly available data sets and pay special attention to specific real applications involving audio processing for music genre prediction and hyperspectral satellite image processing for Earth and climate monitoring.},
  number = {4},
  journal = {IEEE Signal Processing Magazine},
  author = {Arenas-Garcia, J. and Petersen, K. and Camps-Valls, G. and Hansen, L.K.},
  month = jul,
  year = {2013},
  pages = {16--29}
}
