Several kernel algorithms have recently been proposed for nonlinear discriminant analysis. However, these methods mainly address the singularity problem in the high-dimensional feature space; less attention has been paid to the properties of the resulting discriminant vectors and feature vectors in the reduced-dimensional space. In this paper, we present a new formulation for kernel discriminant analysis that includes, as special cases, kernel uncorrelated discriminant analysis (KUDA) and kernel orthogonal discriminant analysis (KODA). The feature vectors produced by KUDA are mutually uncorrelated, while the discriminant vectors of KODA are mutually orthogonal in the feature space. We present theoretical derivations of the proposed KUDA and KODA algorithms. Experimental results show that both KUDA and KODA are competitive with other nonlinear discriminant algorithms in terms of classification accuracy.
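To make the two constraints concrete, the following is a plausible formalization in the standard notation of uncorrelated and orthogonal discriminant analysis; the symbols $w_i$ and $S_t^{\phi}$ are our illustrative notation, not taken verbatim from the paper. Let $w_1, \dots, w_q$ be the discriminant vectors in the kernel-induced feature space and let $S_t^{\phi}$ denote the total scatter matrix in that space. KUDA then requires
\[
w_i^{\top} S_t^{\phi} w_j = 0 \quad \text{for } i \neq j,
\]
so that the extracted features are mutually uncorrelated, whereas KODA requires
\[
w_i^{\top} w_j = 0 \quad \text{for } i \neq j,
\]
i.e., the discriminant vectors themselves are orthogonal in the feature space.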