We propose a new family of classification algorithms in the spirit of support vector machines that builds in non-conservative protection against noise and controls overfitting. Our formulation is based on a softer version of robust optimization called comprehensive robustness. We show that this formulation is equivalent to regularization by an arbitrary convex regularizer. We explain how the connection between comprehensive robustness and convex risk measures can be used to design risk-constrained classifiers that are robust to the input distribution. Our formulations lead to convex optimization problems that are efficiently solvable. Empirical results show the promise of comprehensive robust classifiers in handling risk-sensitive classification.