Discriminative Joint Vector And Component Reduction For Gaussian Mixture Models. Bar-Yosef, Y. & Bistritz, Y. In 2019 27th European Signal Processing Conference (EUSIPCO), pages 1-5, September 2019.
We introduce a discriminative parametric vector dimensionality reduction algorithm for Gaussian mixtures that is performed jointly with mixture component reduction. The reduction algorithm is based on the variational maximum mutual information (VMMI) method, which, in contrast to other reduction algorithms, requires only the parameters of the existing high-order, high-dimensional mixture models. The idea behind the proposed approach, called JVC-VMMI (joint vector and component VMMI), differs significantly from traditional classification approaches that first perform dimensionality reduction separately and then use the low-dimensional feature vectors to train lower-order models. Because JVC-VMMI does not require the original data samples, the reduced models optimized for the classification task can be computed extremely efficiently. We report vowel classification experiments in which JVC-VMMI outperformed the conventional Linear Discriminant Analysis (LDA) and Neighborhood Component Analysis (NCA) dimensionality reduction methods.
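For orientation, below is a minimal sketch of the *baseline* pipeline the abstract contrasts against: dimensionality reduction (here LDA, via scikit-learn) performed separately first, then low-order per-class Gaussian mixtures trained on the reduced features and used for maximum-likelihood classification. The JVC-VMMI method itself, which works directly on the parameters of existing high-order GMMs through the VMMI criterion, is not reproduced here; the dataset, dimensions, and mixture orders below are illustrative assumptions, not values from the paper.

```python
# Baseline sketch: reduce dimensionality first, then train low-order GMMs.
# This is the conventional two-stage approach the abstract compares against,
# not the JVC-VMMI algorithm from the paper.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

# Stand-in dataset (the paper uses vowel features; digits is just for illustration).
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: reduce the feature vectors separately, before any GMM training.
lda = LinearDiscriminantAnalysis(n_components=5).fit(X_tr, y_tr)
Z_tr, Z_te = lda.transform(X_tr), lda.transform(X_te)

# Step 2: train a low-order GMM per class on the reduced vectors.
classes = np.unique(y_tr)
gmms = {c: GaussianMixture(n_components=2, covariance_type="diag",
                           random_state=0).fit(Z_tr[y_tr == c])
        for c in classes}

# Step 3: classify each test vector by the highest per-class log-likelihood.
scores = np.column_stack([gmms[c].score_samples(Z_te) for c in classes])
y_hat = classes[scores.argmax(axis=1)]
print("baseline accuracy:", (y_hat == y_te).mean())
```

In contrast to this two-stage baseline, the approach described in the abstract derives both the low-dimensional projection and the reduced mixture components jointly from the high-order GMM parameters, without revisiting the training samples.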
