Support vector (SV) machines are useful tools for classifying populations characterized by abrupt decreases in their density functions. For at least one class of Gaussian data model, however, the SV classifier is not optimal according to a mean generalization error criterion. In real-world problems we have neither Gaussian populations nor data with sharp linear boundaries. Thus the SV (maximal margin) classifiers can be outperformed by other methods in which more than a fixed number of support vectors contribute to determining the final weights of the classification and prediction rules. A good alternative to the linear SV machine is a specially trained and optimally stopped single-layer perceptron (SLP) in a transformed feature space obtained after decorrelating and scaling the multivariate data. © 2000 Elsevier Science Ltd. All rights reserved.
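
To make the proposed alternative concrete, the following is a minimal sketch, not the paper's exact training procedure: it whitens the data (the decorrelation and scaling step the abstract refers to), then trains a sigmoid single-layer perceptron by gradient descent and keeps the weights with the lowest validation error (optimal stopping). All function names, hyperparameters, and the toy Gaussian data below are illustrative assumptions.

```python
# Sketch: whitening transform + single-layer perceptron with early stopping.
# Hyperparameters and data are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)


def whiten(X, eps=1e-8):
    """Decorrelate and scale features: zero mean, (near-)identity covariance."""
    mean = X.mean(axis=0)
    cov = np.cov(X - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    W = eigvecs / np.sqrt(eigvals + eps)          # whitening matrix V Lambda^{-1/2}
    return (X - mean) @ W, mean, W


def train_slp_early_stopped(X_tr, y_tr, X_val, y_val, lr=0.05, epochs=500):
    """Sigmoid SLP trained by batch gradient descent; return the weights that
    achieved the lowest validation error seen during training."""
    n, d = X_tr.shape
    w, b = np.zeros(d), 0.0
    best = (np.inf, w.copy(), b)
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X_tr @ w + b)))  # sigmoid output
        w -= lr * (X_tr.T @ (p - y_tr)) / n        # cross-entropy gradient step
        b -= lr * np.mean(p - y_tr)
        val_err = np.mean(((X_val @ w + b) > 0).astype(int) != y_val)
        if val_err < best[0]:                      # "optimal stopping": keep best weights
            best = (val_err, w.copy(), b)
    return best[1], best[2]


# Toy two-class Gaussian data, just to exercise the sketch.
X = np.vstack([rng.normal(-1.0, 1.0, size=(200, 2)),
               rng.normal(+1.0, 1.0, size=(200, 2))])
y = np.array([0] * 200 + [1] * 200)

Xw, mean, W = whiten(X)
idx = rng.permutation(len(y))
tr, val = idx[:300], idx[300:]
w, b = train_slp_early_stopped(Xw[tr], y[tr], Xw[val], y[val])
print("validation-selected SLP weights:", w, "bias:", b)
```

One plausible reading of the decorrelation and scaling step is that, in the whitened space, gradient descent on the SLP weights is better conditioned, so the validation-based stopping point is easier to locate; this is an interpretation, not a claim made explicitly in the abstract.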