Recursive Feature Elimination (RFE) combined with feature ranking is an effective technique for eliminating irrelevant features when the feature dimension is large, but it remains difficult to distinguish relevant from redundant features. The usual method of deciding when to stop eliminating features relies on either a held-out validation set or cross-validation. In this paper, we present feature selection criteria based on out-of-bootstrap (OOB) estimates and class separability, both computed on the training set, thereby obviating the need for a validation set. The RFE method described in this paper uses a two-class neural network classifier, and features are ranked by the magnitude of the network weights. This approach is compared experimentally with a noisy bootstrapped version of Fisher's Linear Discriminant (FLD) for ranking features. The techniques are extended to multi-class problems using the Error-Correcting Output Coding (ECOC) method. Experimental investigation on artif...
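The elimination loop underlying RFE can be sketched as follows. This is a minimal illustration, not the paper's method: a least-squares linear classifier stands in for the two-class neural network (an assumption made purely for brevity), and features are discarded one at a time by smallest absolute weight, which is the ranking principle the abstract describes.

```python
# Minimal RFE sketch: rank features by the |weight| a linear model assigns
# them and eliminate the weakest feature each round. A least-squares fit
# substitutes for the paper's neural network classifier (an assumption
# made only to keep the sketch self-contained).
import numpy as np

def rfe_by_weight_magnitude(X, y, n_keep):
    """Return the indices of the n_keep surviving features."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:
        Xs = X[:, remaining]
        # Fit linear weights w minimizing ||Xs w - y||^2.
        w, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        # Drop the feature with the smallest absolute weight.
        remaining.pop(int(np.argmin(np.abs(w))))
    return remaining

# Toy data: feature 0 carries the class label, features 1-2 are pure noise.
rng = np.random.default_rng(0)
y = rng.choice([-1.0, 1.0], size=200)
X = np.column_stack([y + 0.1 * rng.normal(size=200),
                     rng.normal(size=200),
                     rng.normal(size=200)])
print(rfe_by_weight_magnitude(X, y, 1))
```

In this sketch the stopping point (`n_keep`) is fixed in advance; the criteria proposed in the paper instead decide when to stop using OOB estimates or class separability computed on the training set.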