Prototype classifiers trained with a multi-class classification objective are inferior in pattern retrieval and outlier rejection. To improve the binary classification (detection, verification, retrieval, outlier rejection) performance of prototype classifiers, we propose a one-vs-all training method, which extends each prototype into a binary discriminant function with a local threshold, and optimizes both the prototype vectors and the thresholds on training data using a binary classification objective, the cross-entropy (CE) loss. Experimental results on two OCR datasets show that prototype classifiers trained by the one-vs-all method are superior in both multi-class classification and binary classification.
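To make the training setup concrete, the following is a minimal sketch, not the paper's implementation: it assumes one prototype per class, squared Euclidean distance as the dissimilarity measure, and PyTorch as the framework; the names PrototypeOVA, one_vs_all_loss, n_classes, and dim are illustrative. Each prototype's discriminant (its local threshold minus the distance to the input) is treated as an independent binary classifier and trained one-vs-all with binary cross-entropy.

```python
# Minimal sketch under the assumptions stated above (not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PrototypeOVA(nn.Module):
    def __init__(self, n_classes: int, dim: int):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(n_classes, dim))  # one prototype per class (assumed)
        self.thresholds = nn.Parameter(torch.zeros(n_classes))       # learnable local threshold per prototype

    def forward(self, x):
        # Binary discriminant per class: threshold minus squared distance to the prototype.
        # A positive value means "accept as this class"; argmax gives the multi-class decision.
        d2 = torch.cdist(x, self.prototypes).pow(2)   # (batch, n_classes)
        return self.thresholds - d2

def one_vs_all_loss(logits, labels, n_classes):
    # Each class's discriminant is trained as an independent binary classifier
    # with the cross-entropy (binary CE) objective over positive/negative samples.
    targets = F.one_hot(labels, n_classes).float()
    return F.binary_cross_entropy_with_logits(logits, targets)

# Toy usage on random data.
model = PrototypeOVA(n_classes=10, dim=64)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(32, 64), torch.randint(0, 10, (32,))
loss = one_vs_all_loss(model(x), y, n_classes=10)
loss.backward()
opt.step()
```

At test time, multi-class classification takes the argmax of the discriminants, while thresholding each discriminant at zero supports detection, verification, retrieval, and outlier rejection; the per-prototype thresholds are what the one-vs-all objective tunes for the binary decisions.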