Multiclass Ridge-adjusted Slack Variable Optimization using selected basis for fast classification. Yu, Y., Diamantaras, K. I., McKelvey, T., & Kung, S. Y. In 2014 22nd European Signal Processing Conference (EUSIPCO), pages 1178-1182, Sep., 2014.
Abstract: Kernel techniques for classification are especially challenging in terms of computation and memory requirements when data fall into more than two categories. In this paper, we extend a binary classification technique called Ridge-adjusted Slack Variable Optimization (RiSVO) to its multiclass counterpart, where the label information encoding scheme allows the computational complexity to remain the same as in the binary case. The main features of this technique are summarized as follows: (1) only a subset of the data is pre-selected to construct the basis for kernel computation; (2) simultaneous active training set selection for all classes reduces complexity while improving robustness; (3) with the proposed active set selection criteria, the inclusion property is verified empirically. The inclusion property means that once a pattern is excluded, it will no longer return to the active training set and can therefore be permanently removed from the training procedure. This property greatly reduces the complexity. The proposed techniques are evaluated on the standard multiclass datasets MNIST, USPS, pendigits and letter, so that the results can easily be compared with existing work.
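The abstract describes the method only at a high level. As a purely illustrative reading of its three ingredients (a pre-selected kernel basis, ridge-regularized training on a shrinking active set, and the empirically observed inclusion property), the sketch below is a minimal NumPy mock-up. It is not the authors' RiSVO algorithm: the RBF kernel, the kernel ridge least-squares update, the margin-based selection rule, and all names (rbf_kernel, active_set_training, rho, margin) are assumptions introduced here for illustration.

import numpy as np

def rbf_kernel(X, B, gamma=0.1):
    # Kernel between data X (n, d) and a pre-selected basis subset B (m, d);
    # restricting the basis to a subset corresponds to feature (1) of the abstract.
    d2 = ((X[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def active_set_training(X, Y, basis_idx, rho=1.0, margin=0.0, n_iter=10):
    # Hypothetical sketch: ridge-regularized least-squares training of all
    # classes at once (Y is one-hot) on a shrinking active set. The selection
    # rule below is a placeholder, not the RiSVO criterion.
    B = X[basis_idx]
    active = np.arange(len(X))          # start with every pattern active
    W = None
    for _ in range(n_iter):
        K = rbf_kernel(X[active], B)    # kernel features of active patterns only
        W = np.linalg.solve(K.T @ K + rho * np.eye(K.shape[1]), K.T @ Y[active])
        scores = rbf_kernel(X, B) @ W
        true_s = (scores * Y).sum(axis=1)           # score of the true class
        other_s = (scores - 1e9 * Y).max(axis=1)    # best competing class
        violating = (true_s - other_s) < margin
        # Inclusion property (verified empirically in the paper): once a pattern
        # leaves the active set it never returns, so the set only ever shrinks.
        new_active = np.intersect1d(active, np.flatnonzero(violating))
        if new_active.size == active.size or new_active.size == 0:
            break
        active = new_active
    return W, active

For example, W, active = active_set_training(X, np.eye(n_classes)[y], np.random.default_rng(0).choice(len(X), 200, replace=False)) would train with a 200-sample basis; the returned indices are the patterns still in the active set at termination, all others having been permanently dropped along the way.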
@InProceedings{6952415,
author = {Y. Yu and K. I. Diamantaras and T. McKelvey and S. Y. Kung},
booktitle = {2014 22nd European Signal Processing Conference (EUSIPCO)},
title = {Multiclass Ridge-adjusted Slack Variable Optimization using selected basis for fast classification},
year = {2014},
pages = {1178-1182},
 abstract = {Kernel techniques for classification are especially challenging in terms of computation and memory requirements when data fall into more than two categories. In this paper, we extend a binary classification technique called Ridge-adjusted Slack Variable Optimization (RiSVO) to its multiclass counterpart, where the label information encoding scheme allows the computational complexity to remain the same as in the binary case. The main features of this technique are summarized as follows: (1) only a subset of the data is pre-selected to construct the basis for kernel computation; (2) simultaneous active training set selection for all classes reduces complexity while improving robustness; (3) with the proposed active set selection criteria, the inclusion property is verified empirically. The inclusion property means that once a pattern is excluded, it will no longer return to the active training set and can therefore be permanently removed from the training procedure. This property greatly reduces the complexity. The proposed techniques are evaluated on the standard multiclass datasets MNIST, USPS, pendigits and letter, so that the results can easily be compared with existing work.},
keywords = {computational complexity;optimisation;signal classification;inclusion property;active training set selection;computational complexity;label information encoding scheme;binary classification technique;fast classification;multiclass RiSVO;ridge-adjusted slack variable optimization;Training;Kernel;Training data;Support vector machines;Vectors;Equations;Optimization;RiSVO;kernel;multiclass classification;large scale data;RKHS basis construction},
issn = {2076-1465},
month = {Sep.},
url = {https://www.eurasip.org/proceedings/eusipco/eusipco2014/html/papers/1569924411.pdf},
}
{"_id":"R6fw5ZKD557Wte2uW","bibbaseid":"yu-diamantaras-mckelvey-kung-multiclassridgeadjustedslackvariableoptimizationusingselectedbasisforfastclassification-2014","authorIDs":[],"author_short":["Yu, Y.","Diamantaras, K. I.","McKelvey, T.","Kung, S. Y."],"bibdata":{"bibtype":"inproceedings","type":"inproceedings","author":[{"firstnames":["Y."],"propositions":[],"lastnames":["Yu"],"suffixes":[]},{"firstnames":["K.","I."],"propositions":[],"lastnames":["Diamantaras"],"suffixes":[]},{"firstnames":["T."],"propositions":[],"lastnames":["McKelvey"],"suffixes":[]},{"firstnames":["S.","Y."],"propositions":[],"lastnames":["Kung"],"suffixes":[]}],"booktitle":"2014 22nd European Signal Processing Conference (EUSIPCO)","title":"Multiclass Ridge-adjusted Slack Variable Optimization using selected basis for fast classification","year":"2014","pages":"1178-1182","abstract":"Kernel techniques for classification is especially challenging in terms of computation and memory requirement when data fall into more than two categories. In this paper, we extend a binary classification technique called Ridge-adjusted Slack Variable Optimization (RiSVO) to its multiclass counterpart where the label information encoding scheme allows the computational complexity to remain the same to the binary case. The main features of this technique are summarized as follows: (1) Only a subset of data are pre-selected to construct the basis for kernel computation; (2) Simultaneous active training set selection for all classes helps reduce complexity meanwhile improving robustness; (3) With the proposed active set selection criteria, inclusion property is verified empirically. Inclusion property means that once a pattern is excluded, it will no longer return to the active training set and therefore can be permanently removed from the training procedure. This property greatly reduce the complexity. The proposed techniques are evaluated on standard multiclass datasets MNIST, USPS, pendigits and letter which could be easily compared with existing results.","keywords":"computational complexity;optimisation;signal classification;inclusion property;active training set selection;computational complexity;label information encoding scheme;binary classification technique;fast classification;multiclass RiSVO;ridge-adjusted slack variable optimization;Training;Kernel;Training data;Support vector machines;Vectors;Equations;Optimization;RiSVO;kernel;multiclass classification;large scale data;RKHS basis construction","issn":"2076-1465","month":"Sep.","url":"https://www.eurasip.org/proceedings/eusipco/eusipco2014/html/papers/1569924411.pdf","bibtex":"@InProceedings{6952415,\n author = {Y. Yu and K. I. Diamantaras and T. McKelvey and S. Y. Kung},\n booktitle = {2014 22nd European Signal Processing Conference (EUSIPCO)},\n title = {Multiclass Ridge-adjusted Slack Variable Optimization using selected basis for fast classification},\n year = {2014},\n pages = {1178-1182},\n abstract = {Kernel techniques for classification is especially challenging in terms of computation and memory requirement when data fall into more than two categories. In this paper, we extend a binary classification technique called Ridge-adjusted Slack Variable Optimization (RiSVO) to its multiclass counterpart where the label information encoding scheme allows the computational complexity to remain the same to the binary case. 
The main features of this technique are summarized as follows: (1) Only a subset of data are pre-selected to construct the basis for kernel computation; (2) Simultaneous active training set selection for all classes helps reduce complexity meanwhile improving robustness; (3) With the proposed active set selection criteria, inclusion property is verified empirically. Inclusion property means that once a pattern is excluded, it will no longer return to the active training set and therefore can be permanently removed from the training procedure. This property greatly reduce the complexity. The proposed techniques are evaluated on standard multiclass datasets MNIST, USPS, pendigits and letter which could be easily compared with existing results.},\n keywords = {computational complexity;optimisation;signal classification;inclusion property;active training set selection;computational complexity;label information encoding scheme;binary classification technique;fast classification;multiclass RiSVO;ridge-adjusted slack variable optimization;Training;Kernel;Training data;Support vector machines;Vectors;Equations;Optimization;RiSVO;kernel;multiclass classification;large scale data;RKHS basis construction},\n issn = {2076-1465},\n month = {Sep.},\n url = {https://www.eurasip.org/proceedings/eusipco/eusipco2014/html/papers/1569924411.pdf},\n}\n\n","author_short":["Yu, Y.","Diamantaras, K. I.","McKelvey, T.","Kung, S. Y."],"key":"6952415","id":"6952415","bibbaseid":"yu-diamantaras-mckelvey-kung-multiclassridgeadjustedslackvariableoptimizationusingselectedbasisforfastclassification-2014","role":"author","urls":{"Paper":"https://www.eurasip.org/proceedings/eusipco/eusipco2014/html/papers/1569924411.pdf"},"keyword":["computational complexity;optimisation;signal classification;inclusion property;active training set selection;computational complexity;label information encoding scheme;binary classification technique;fast classification;multiclass RiSVO;ridge-adjusted slack variable optimization;Training;Kernel;Training data;Support vector machines;Vectors;Equations;Optimization;RiSVO;kernel;multiclass classification;large scale data;RKHS basis construction"],"metadata":{"authorlinks":{}},"downloads":0},"bibtype":"inproceedings","biburl":"https://raw.githubusercontent.com/Roznn/EUSIPCO/main/eusipco2014url.bib","creationDate":"2021-02-13T17:43:41.674Z","downloads":0,"keywords":["computational complexity;optimisation;signal classification;inclusion property;active training set selection;computational complexity;label information encoding scheme;binary classification technique;fast classification;multiclass risvo;ridge-adjusted slack variable optimization;training;kernel;training data;support vector machines;vectors;equations;optimization;risvo;kernel;multiclass classification;large scale data;rkhs basis construction"],"search_terms":["multiclass","ridge","adjusted","slack","variable","optimization","using","selected","basis","fast","classification","yu","diamantaras","mckelvey","kung"],"title":"Multiclass Ridge-adjusted Slack Variable Optimization using selected basis for fast classification","year":2014,"dataSources":["A2ezyFL6GG6na7bbs","oZFG3eQZPXnykPgnE"]}