— We propose a feature selection criterion based on kernel discriminant analysis (KDA) for multi-class problems, which finds the eigenvectors on which the projected class data are locally maximally separated. The proposed criterion is the sum of the KDA objective function values associated with these eigenvectors, which reduces to the sum of the associated eigenvalues and is shown to be monotonic with respect to the deletion or addition of features. Using the backward feature selection strategy, for several multi-class data sets we compared the proposed criterion with a criterion based on the recognition rate of the support vector machine (SVM) estimated by cross-validation. From the standpoint of generalization ability, the proposed criterion is comparable to the SVM-based recognition rate, although the proposed method does not require cross-validation.
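The following is a minimal sketch, not the authors' implementation, of the idea summarized above: the criterion is the sum of the leading KDA eigenvalues computed on the currently selected features, and backward selection repeatedly removes the feature whose deletion decreases that criterion the least. The RBF kernel, the regularization constant, and the function names (kda_criterion, backward_select) are illustrative assumptions; the paper's local formulation of KDA may differ from the standard dual-form scatter matrices used here.

```python
import numpy as np
from scipy.linalg import eigh


def kda_criterion(X, y, gamma=1.0, reg=1e-6):
    """Sum of KDA eigenvalues: the selection criterion described in the abstract."""
    n = X.shape[0]
    # RBF kernel matrix on the currently selected features (assumed kernel choice)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

    classes = np.unique(y)
    m_all = K.mean(axis=1)
    M = np.zeros((n, n))           # between-class scatter (dual form)
    N = np.zeros((n, n))           # within-class scatter (dual form)
    for c in classes:
        idx = np.where(y == c)[0]
        Kc = K[:, idx]
        d = Kc.mean(axis=1) - m_all
        M += len(idx) * np.outer(d, d)
        centering = np.eye(len(idx)) - np.ones((len(idx), len(idx))) / len(idx)
        N += Kc @ centering @ Kc.T
    # Generalized eigenproblem M a = lambda N a; keep the (#classes - 1) leading eigenvalues
    eigvals = eigh(M, N + reg * np.eye(n), eigvals_only=True)
    return float(np.sum(np.sort(eigvals)[::-1][: len(classes) - 1]))


def backward_select(X, y, n_keep, gamma=1.0):
    """Backward feature selection: drop, at each step, the feature whose removal
    decreases the criterion the least (the criterion is monotonic for deletions)."""
    selected = list(range(X.shape[1]))
    while len(selected) > n_keep:
        scores = [(kda_criterion(X[:, [f for f in selected if f != j]], y, gamma), j)
                  for j in selected]
        _, worst_feature = max(scores)   # removal that hurts the criterion the least
        selected.remove(worst_feature)
    return selected
```

Because the criterion requires only one generalized eigenvalue decomposition per candidate feature subset, no cross-validation loop is needed, which is the computational advantage the abstract contrasts with the SVM recognition-rate criterion.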