A cost-sensitive extension of Real AdaBoost, denoted asymmetric Real AdaBoost (RAB), is proposed. Asymmetric RAB differs from the naïve RAB in two main respects: (1) a Chernoff measure, rather than the Bhattacharyya measure used in naïve RAB, is employed to evaluate the best weak classifier during training, and (2) the weights of positives and negatives are updated separately at each boosting step. An upper bound on the training error is also provided. Experimental results demonstrate its cost-sensitivity when selecting weak classifiers, and show that it outperforms previously proposed cost-sensitive extensions of Discrete AdaBoost (DAB) as well as several extensions of Real AdaBoost. Moreover, it consumes much less training time than the previously proposed DAB extensions.
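To make the two modifications concrete, the following is a minimal sketch of one boosting round in the spirit described above. The abstract does not give the exact formulas, so the Chernoff exponent `s`, the cost parameters `c_pos`/`c_neg`, and the per-class normalization are assumptions for illustration only, not the authors' definitions; with `s = 0.5` and equal costs the criterion reduces to the usual Bhattacharyya-style factor of plain Real AdaBoost.

```python
# Illustrative sketch only; parameter names and update forms are assumed,
# not taken from the paper.
import numpy as np

def fit_stump_real(X, y, w, n_bins=8, s=0.5, eps=1e-9):
    """Pick the feature whose binned partition minimizes a Chernoff-style
    criterion sum_j (W+_j)^(1-s) * (W-_j)^s; s = 0.5 recovers the
    Bhattacharyya-style criterion of plain Real AdaBoost."""
    best = None
    for f in range(X.shape[1]):
        # Equal-width binning of the feature (domain partitioning).
        edges = np.linspace(X[:, f].min(), X[:, f].max(), n_bins + 1)
        bins = np.clip(np.digitize(X[:, f], edges[1:-1]), 0, n_bins - 1)
        W_pos = np.array([w[(bins == j) & (y == 1)].sum() for j in range(n_bins)])
        W_neg = np.array([w[(bins == j) & (y == -1)].sum() for j in range(n_bins)])
        z = np.sum((W_pos + eps) ** (1 - s) * (W_neg + eps) ** s)
        if best is None or z < best[0]:
            # Real-valued per-bin votes, as in Real AdaBoost.
            votes = 0.5 * np.log((W_pos + eps) / (W_neg + eps))
            best = (z, f, edges, votes)
    return best

def asymmetric_boost_step(X, y, w, c_pos=2.0, c_neg=1.0, s=0.5):
    """One boosting round with class-wise weight updates (assumed form):
    positives and negatives are re-normalized separately so that the
    costlier class keeps a larger share of the total weight."""
    _, f, edges, votes = fit_stump_real(X, y, w, s=s)
    bins = np.clip(np.digitize(X[:, f], edges[1:-1]), 0, len(votes) - 1)
    h = votes[bins]
    w = w * np.exp(-y * h)  # usual Real AdaBoost re-weighting
    pos, neg = (y == 1), (y == -1)
    w[pos] *= c_pos / (c_pos + c_neg) / w[pos].sum()  # separate normalization
    w[neg] *= c_neg / (c_pos + c_neg) / w[neg].sum()  # total weight stays 1
    return w, (f, edges, votes)
```

In this reading, cost-sensitivity enters in two places: the asymmetric criterion biases weak-classifier selection toward partitions that separate the costlier class well, and the separate re-normalization prevents the positive class from losing weight mass across rounds.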