A cost-sensitive extension of boosting, denoted asymmetric boosting, is presented. Unlike previous proposals, the new algorithm is derived from sound decision-theoretic principles, exploiting the statistical interpretation of boosting to obtain a principled cost-sensitive extension of the boosting loss. As in AdaBoost, the cost-sensitive extension minimizes this loss by gradient descent on the functional space of convex combinations of weak learners, and produces large margin detectors. It is shown that asymmetric boosting is fully compatible with AdaBoost, in the sense that it reduces to the latter when errors are weighted equally. Experimental evidence is provided in support of the claims of cost-sensitivity and large margin. The algorithm is also applied to the computer vision problem of face detection, where it is shown to outperform a number of previous heuristic proposals for cost-sensitive boosting (AdaCost, CSB0, CSB1, CSB2, asymmetric AdaBoost, AdaC1, AdaC2, and AdaC3).
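The relationship between the cost-sensitive loss and AdaBoost's loss can be illustrated with a minimal sketch. The specific functional form below, with per-class costs c_pos and c_neg placed inside the exponent, is an assumption for illustration and not the paper's exact derivation; it does, however, exhibit the compatibility property claimed above, reducing to AdaBoost's exponential loss when both costs equal one.

```python
# Minimal sketch (assumed form, not the paper's exact loss):
#   L(y, f) = exp(-c_pos * f) if y = +1,  exp(+c_neg * f) if y = -1,
# which reduces to AdaBoost's loss exp(-y * f) when c_pos = c_neg = 1.
import numpy as np

def cost_sensitive_exp_loss(y, f, c_pos=1.0, c_neg=1.0):
    """Per-example cost-sensitive exponential loss.

    y : array of labels in {+1, -1}
    f : array of real-valued detector outputs
    c_pos, c_neg : costs weighting errors on positives and negatives
    """
    costs = np.where(y == 1, c_pos, c_neg)
    return np.exp(-costs * y * f)

y = np.array([1, -1, 1, -1])
f = np.array([0.8, -0.3, -0.5, 1.2])  # hypothetical detector outputs

# Equal costs: the loss coincides with AdaBoost's exponential loss.
assert np.allclose(cost_sensitive_exp_loss(y, f), np.exp(-y * f))

# Unequal costs: missed positives (false negatives) are penalized more,
# as desired in detection problems such as face detection.
print(cost_sensitive_exp_loss(y, f, c_pos=2.0, c_neg=1.0))
```

Minimizing such a loss by gradient descent in the span of weak learners then yields AdaBoost-style weight updates in which the two error types are reweighted at different rates.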