In *Structural, Syntactic and Statistical Pattern Recognition. Lecture Notes in Computer Science.*, volume 3138, pages 707–715, 2004. Springer.


In linear discriminant (LD) analysis, a high sample-size-to-feature ratio is desirable. The linear programming (LP) procedure for LD identification handles the curse of dimensionality by simultaneously minimizing the L1 norms of the classification errors and of the LD weights. The sparseness of the solution (the fraction of features retained) can be controlled by a parameter in the objective function. By qualitatively analyzing the objective function and the constraints of the problem, we show why sparseness arises. In a sparse solution, large values of the LD weight vector reveal the individual features most important for the decision boundary.
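The L1-norm LP idea summarized above can be sketched as a standard linear program: split the weight vector into non-negative parts, add one slack variable per sample, and minimize the slack sum plus a parameter C times the L1 norm of the weights. This is a minimal illustration of the general formulation, not the authors' exact procedure; the function name and parameter C are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def l1_lp_discriminant(X, y, C=1.0):
    """Fit a linear discriminant by LP: minimize sum(slacks) + C * ||w||_1.

    Sketch of the general L1-norm LP formulation (not the paper's exact
    procedure). y must be in {-1, +1}; the decision rule is sign(X @ w + b).
    Larger C drives more weight components to exactly zero.
    """
    n, p = X.shape
    # Variables z = [w_plus (p), w_minus (p), b_plus, b_minus, slack (n)],
    # all non-negative; w = w_plus - w_minus, b = b_plus - b_minus.
    c = np.concatenate([C * np.ones(2 * p), [0.0, 0.0], np.ones(n)])
    # Margin constraints y_i (w . x_i + b) >= 1 - slack_i, as A_ub @ z <= b_ub.
    Yx = y[:, None] * X
    A_ub = np.hstack([-Yx, Yx, -y[:, None], y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * p + 2 + n), method="highs")
    w = res.x[:p] - res.x[p:2 * p]
    b = res.x[2 * p] - res.x[2 * p + 1]
    return w, b
```

A standard exchange argument shows that increasing C can never increase the optimal ||w||_1, so the single parameter trades classification error against the fraction of features retained, as the abstract describes.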

@inproceedings{pranckeviciene2004control,
  title     = {Control of sparseness for feature selection},
  author    = {Pranckeviciene, Erinija and Baumgartner, Richard and Somorjai, Ray and Bowman, Christopher},
  booktitle = {Structural, Syntactic and Statistical Pattern Recognition. Lecture Notes in Computer Science},
  volume    = {3138},
  pages     = {707--715},
  year      = {2004},
  publisher = {Springer},
  abstract  = {In linear discriminant (LD) analysis, a high sample-size-to-feature ratio is desirable. The linear programming (LP) procedure for LD identification handles the curse of dimensionality by simultaneously minimizing the L1 norms of the classification errors and of the LD weights. The sparseness of the solution (the fraction of features retained) can be controlled by a parameter in the objective function. By qualitatively analyzing the objective function and the constraints of the problem, we show why sparseness arises. In a sparse solution, large values of the LD weight vector reveal the individual features most important for the decision boundary.},
  keywords  = {pattern recognition, discrete mathematics, artificial intelligence, linear programming, L1-norm support vector machine},
  url_Link  = {http://link.springer.com/chapter/10.1007/978-3-540-27868-9_77}
}
