Abstract. In this paper, new approaches to training set size reduction are presented. These schemes consist of defining a small number of prototypes that represent all the original instances. Although the primary aim of the algorithms proposed here is to obtain a strongly reduced training set, their performance is also evaluated empirically on nine real datasets by comparing the reduction rate and the classification accuracy with those of other condensing techniques.