The Minimax Probability Machine (MPM) constructs a classifier that provides a worst-case bound on the probability of misclassification of future data points, based on reliable estimates of the means and covariance matrices of the classes obtained from the training data, and achieves performance comparable to a state-of-the-art classifier, the Support Vector Machine. In this paper, we remove the assumption of unbiased (equal) weighting of the classes in the MPM and develop a critical extension, named the Biased Minimax Probability Machine (BMPM), to deal with biased classification tasks, which arise especially in medical diagnosis applications. We outline the theoretical derivation of the BMPM. Moreover, we demonstrate that this model can be transformed into a concave-convex Fractional Programming (FP) problem or, equivalently, a pseudoconcave problem. After illustrating our model on a synthetic dataset and applying it to real-world medical diagnosis datasets, we obtain encouraging and promising experimental results.
Kaizhu Huang, Haiqin Yang, Irwin King, Michael R. Lyu
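A minimal sketch of a biased minimax formulation of the kind the abstract describes; the notation (class means \bar{x}, \bar{y}, covariances \Sigma_x, \Sigma_y, worst-case accuracies \alpha, \beta, and threshold \beta_0) is assumed for illustration and is not reproduced from the paper body:

\[
\max_{\alpha,\,\beta,\; b,\; a \neq 0} \; \alpha
\quad \text{s.t.} \quad
\inf_{x \sim (\bar{x}, \Sigma_x)} \Pr\{a^{\top} x \geq b\} \geq \alpha, \qquad
\inf_{y \sim (\bar{y}, \Sigma_y)} \Pr\{a^{\top} y \leq b\} \geq \beta, \qquad
\beta \geq \beta_0 .
\]

In this sketch the two worst-case class accuracies are treated asymmetrically: the accuracy \beta of the less critical class is only required to exceed a pre-specified lower bound \beta_0, while the accuracy \alpha of the critical class is maximized. Via the multivariate Chebyshev bound, each probability constraint can be rewritten in deterministic form, e.g. \( a^{\top}\bar{x} - b \geq \sqrt{\alpha/(1-\alpha)}\,\sqrt{a^{\top}\Sigma_x a} \), which is the route by which the fractional programming formulation mentioned in the abstract is presumably obtained.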