The minimum-distance classifier summarizes each class with a prototype and classifies a new instance by assigning it to the class of the nearest prototype. The original minimum-distance classifier has three drawbacks: it cannot handle symbolic attributes, it cannot weight attributes, and it learns only a single prototype per class. The proposed solutions to these problems are to define a mean for symbolic attributes, to provide an attribute-weighting metric, and to learn several prototypes per class. The resulting learning algorithm, SNMC, increases classification accuracy by 10% over the original minimum-distance classifier and achieves higher average generalization accuracy than both C4.5 and PEBLS on 20 domains from the UCI data repository.
Piew Datta, Dennis F. Kibler
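
The basic scheme can be illustrated with a minimal sketch (Python, not from the paper): numeric attributes are summarized by their mean, symbolic attributes by a simple stand-in "mean" such as the most frequent value, and attributes are weighted inside the distance computation. SNMC's actual definitions of the symbolic mean, the weighting metric, and the multiple-prototype learning step follow the paper and differ from this illustration.

```python
from collections import Counter
import math

def learn_prototypes(examples, labels):
    """Build one prototype per class.

    Numeric attributes are summarized by their mean; symbolic attributes
    by their most frequent value (one simple stand-in for a symbolic
    "mean" -- not necessarily the paper's definition).
    """
    prototypes = {}
    for cls in set(labels):
        members = [x for x, y in zip(examples, labels) if y == cls]
        proto = []
        for i in range(len(members[0])):
            column = [x[i] for x in members]
            if all(isinstance(v, (int, float)) for v in column):
                proto.append(sum(column) / len(column))             # numeric mean
            else:
                proto.append(Counter(column).most_common(1)[0][0])  # symbolic mode
        prototypes[cls] = proto
    return prototypes

def distance(x, proto, weights):
    """Weighted distance: squared difference for numeric attributes,
    0/1 mismatch for symbolic values."""
    d = 0.0
    for xi, pi, wi in zip(x, proto, weights):
        if isinstance(xi, (int, float)) and isinstance(pi, (int, float)):
            d += wi * (xi - pi) ** 2
        else:
            d += wi * (0.0 if xi == pi else 1.0)
    return math.sqrt(d)

def classify(x, prototypes, weights):
    """Assign x to the class of the nearest prototype."""
    return min(prototypes, key=lambda cls: distance(x, prototypes[cls], weights))

# Example usage with mixed numeric/symbolic attributes (toy data):
X = [(1.0, "red"), (1.2, "red"), (5.0, "blue"), (4.8, "blue")]
y = ["a", "a", "b", "b"]
protos = learn_prototypes(X, y)
print(classify((1.1, "red"), protos, weights=[1.0, 1.0]))  # -> "a"
```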