Local Feature Weighting in Nearest Prototype Classification. Fernandez, F. & Isasi, P. IEEE Transactions on Neural Networks, 19(1):40–53, January 2008.
The distance metric is the cornerstone of nearest neighbor (NN)-based methods, and therefore of nearest prototype (NP) algorithms, because these methods classify data according to similarity. When the data are characterized by a set of features that may contribute to the classification task to different degrees, feature weighting or selection is required, sometimes in a local sense. However, local weighting is typically restricted to NN approaches. In this paper, we introduce local feature weighting (LFW) in NP classification. LFW provides each prototype with its own weight vector, in contrast to the typical global weighting methods found in the NP literature, where all prototypes share the same one. Providing each prototype with its own weight vector has a novel effect on the borders of the generated Voronoi regions: they become nonlinear. We have integrated LFW with a previously developed evolutionary nearest prototype classifier (ENPC). Experiments performed on both artificial and real data sets demonstrate that the resulting algorithm, which we call LFW in nearest prototype classification (LFW-NPC), avoids overfitting on training data in domains where the features may contribute differently to the classification task in different areas of the feature space. This generalization capability is also reflected in the automatic discovery of an accurate and reduced set of prototypes. © 2007 IEEE.
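The core idea can be sketched in a few lines: each prototype carries its own feature-weight vector, and a query point takes the label of the prototype with the smallest weighted distance. This is a minimal illustrative sketch only, with assumed names (`weighted_sq_dist`, `classify`) and toy data; it is not the paper's ENPC/LFW-NPC implementation, which evolves the prototypes and weights.

```python
# Illustrative sketch of per-prototype feature weighting in nearest
# prototype classification (NOT the authors' ENPC/LFW-NPC code).

def weighted_sq_dist(x, prototype, weights):
    """Squared Euclidean distance with a per-feature weight vector."""
    return sum(w * (xi - pi) ** 2 for w, xi, pi in zip(weights, x, prototype))

def classify(x, prototypes):
    """prototypes: list of (position, weight_vector, label) triples.

    Each prototype is compared to x under its OWN weight vector, which is
    what makes the induced decision borders nonlinear.
    """
    return min(prototypes, key=lambda p: weighted_sq_dist(x, p[0], p[1]))[2]

# Two prototypes with different local weightings: the first ignores
# feature 1, the second ignores feature 0.
prototypes = [
    ((0.0, 0.0), (1.0, 0.0), "A"),
    ((1.0, 1.0), (0.0, 1.0), "B"),
]
print(classify((0.2, 0.9), prototypes))  # weighted distances 0.04 vs 0.01 -> "B"
```

With a single shared (global) weight vector the borders between Voronoi regions would remain linear; letting each prototype weight the features differently is what bends them.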
@article{Fernandez2008a,
abstract = {The distance metric is the cornerstone of nearest neighbor (NN)-based methods, and therefore of nearest prototype (NP) algorithms, because these methods classify data according to similarity. When the data are characterized by a set of features that may contribute to the classification task to different degrees, feature weighting or selection is required, sometimes in a local sense. However, local weighting is typically restricted to NN approaches. In this paper, we introduce local feature weighting (LFW) in NP classification. LFW provides each prototype with its own weight vector, in contrast to the typical global weighting methods found in the NP literature, where all prototypes share the same one. Providing each prototype with its own weight vector has a novel effect on the borders of the generated Voronoi regions: they become nonlinear. We have integrated LFW with a previously developed evolutionary nearest prototype classifier (ENPC). Experiments performed on both artificial and real data sets demonstrate that the resulting algorithm, which we call LFW in nearest prototype classification (LFW-NPC), avoids overfitting on training data in domains where the features may contribute differently to the classification task in different areas of the feature space. This generalization capability is also reflected in the automatic discovery of an accurate and reduced set of prototypes. {\textcopyright} 2007 IEEE.},
author = {Fernandez, F. and Isasi, P.},
doi = {10.1109/TNN.2007.902955},
issn = {1045-9227},
journal = {IEEE Transactions on Neural Networks},
keywords = {Evolutionary learning,Local feature weighting (LFW),Nearest prototype (NP) classification,Weighted Euclidean distance},
month = {jan},
number = {1},
pages = {40--53},
title = {{Local Feature Weighting in Nearest Prototype Classification}},
url = {http://ieeexplore.ieee.org/document/4359199/},
volume = {19},
year = {2008}
}