In this paper, we derive lower and upper bounds on the probability of error of a linear classifier when the random vectors representing the underlying classes obey the multivariate normal distribution. The expression for the error is derived in the one-dimensional space, independently of the dimensionality of the original problem. Based on the two bounds, we propose an approximating expression for the error of a generic linear classifier. In particular, we derive the corresponding bounds and the approximating expression for the error of Fisher's classifier. Our empirical results on synthetic data, including samples of up to five hundred dimensions, show that the error computations are extremely fast and quite accurate: the approximation differs from the actual error by at most 0.0184340683.
Luís G. Rueda
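The abstract's central observation is that the error of a linear classifier on normal classes can be evaluated in one dimension, by projecting each class onto the discriminant direction. The sketch below illustrates that reduction only; it is not the paper's bounds or approximation, and the function names, the two-class setting, the midpoint threshold, and the use of SciPy's normal CDF are illustrative assumptions.

```python
# Minimal sketch (not the paper's derivation): error of a linear classifier
# on two normal classes, computed in the projected one-dimensional space.
# fisher_direction and linear_classifier_error are hypothetical names.
import numpy as np
from scipy.stats import norm


def fisher_direction(mu1, sigma1, mu2, sigma2):
    """Fisher's discriminant direction w = S_W^{-1} (mu2 - mu1)."""
    return np.linalg.solve(sigma1 + sigma2, mu2 - mu1)


def linear_classifier_error(w, w0, mu1, sigma1, mu2, sigma2, p1=0.5):
    """Probability of error of the rule: decide class 1 if w.x + w0 < 0.

    Projecting onto w reduces the problem to one dimension: under class i,
    y = w.x + w0 is normal with mean w.mu_i + w0 and variance w.Sigma_i.w,
    whatever the original dimensionality.
    """
    m1, s1 = w @ mu1 + w0, np.sqrt(w @ sigma1 @ w)
    m2, s2 = w @ mu2 + w0, np.sqrt(w @ sigma2 @ w)
    # Class 1 is misclassified when y >= 0; class 2 when y < 0.
    return p1 * norm.sf(-m1 / s1) + (1.0 - p1) * norm.cdf(-m2 / s2)


# Usage: two 3-D normal classes with equal priors (an assumption here).
mu1, mu2 = np.zeros(3), np.ones(3)
sigma1, sigma2 = np.eye(3), 2.0 * np.eye(3)
w = fisher_direction(mu1, sigma1, mu2, sigma2)
w0 = -w @ (mu1 + mu2) / 2.0  # threshold at the projected midpoint
print(linear_classifier_error(w, w0, mu1, sigma1, mu2, sigma2))
```

The cost of the computation is dominated by the two quadratic forms w.Sigma_i.w, so it scales mildly with the feature dimension, consistent with the fast evaluations reported in the abstract.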