Interest in the ROC curve for characterizing machine learning performance has grown steadily in recent years. This is largely because, in real-world problems, misclassification costs are often unknown, so the ROC curve and related metrics such as the Area Under the ROC Curve (AUC) can be more meaningful performance measures. In this paper, we propose a quadratic-programming-based algorithm for AUC maximization and show that, under certain conditions, 2-norm soft-margin Support Vector Machines can also maximize AUC. We present experiments comparing SVM performance to that of other AUC-maximization algorithms and provide an empirical analysis of SVM behavior with respect to ROC-based metrics. Our main conclusion is that SVMs can maximize both AUC and accuracy, in contrast to algorithms such as RankBoost that optimize AUC only.