Abstract. We present a performance analysis of three linear dimensionality reduction techniques: Fisher's discriminant analysis (FDA) and two recently introduced methods based on the Chernoff distance between two distributions, namely the Loog and Duin (LD) method, which maximizes a criterion derived from the Chernoff distance in the original space, and the Rueda and Herrera (RH) method, which maximizes the Chernoff distance in the transformed space. A comprehensive evaluation of these methods, combined with two well-known classifiers, linear and quadratic, on synthetic and real-life data shows that LD and RH outperform FDA, especially with the quadratic classifier, which is strongly related to the Chernoff distance in the transformed space. For the linear classifier, the superiority of RH over the other two methods is also demonstrated.
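For context, the abstract itself states no formula, but the criterion all three references build on has a standard closed form for two Gaussian class-conditional densities. A sketch, assuming classes $\mathcal{N}(\mu_1,\Sigma_1)$ and $\mathcal{N}(\mu_2,\Sigma_2)$ (the symbols $\mu_i$, $\Sigma_i$, $s$, and $k(s)$ are introduced here for illustration only), is the Chernoff distance
$$
k(s) \;=\; \frac{s(1-s)}{2}\,(\mu_2-\mu_1)^{\top}\bigl[s\Sigma_1+(1-s)\Sigma_2\bigr]^{-1}(\mu_2-\mu_1) \;+\; \frac{1}{2}\log\frac{\bigl|\,s\Sigma_1+(1-s)\Sigma_2\,\bigr|}{|\Sigma_1|^{s}\,|\Sigma_2|^{1-s}}, \qquad s\in(0,1).
$$
Setting $s=\tfrac12$ recovers the Bhattacharyya distance as a special case; in the terminology of the abstract, LD optimizes a criterion of this form in the original space, whereas RH maximizes it in the transformed space.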