AdaBoost and support vector machine (SVM) algorithms are commonly used in the field of object recognition. As classifiers, their classification performance is sensitive to the feature sets they are given. To improve this performance, in addition to training the classifiers accurately, attention must be given to determining which feature subset to use in the classifier. Evaluating feature sets by the margin of the decision boundary of an SVM classifier, as proposed by Kugler, is one solution to this problem. However, the margin of an SVM is sometimes inflated by outliers. This paper presents a feature selection method that uses a contribution ratio based on boosting, which is effective for evaluating features. By comparing our method with the conventional margin-based one, we found that our method selects better feature sets using the contribution ratio obtained from boosting.
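As an illustrative sketch only (not the paper's exact formulation), a boosting-based contribution ratio can be computed by running AdaBoost with single-feature decision stumps and assigning each feature its share of the weak-learner weights alpha_t: features that boosting repeatedly selects with large alpha receive a high contribution ratio, and the top-ranked features form the selected subset. The dataset, the stump learner, and the normalization below are all assumptions made for this sketch.

```python
import math
import random

def train_stump(X, y, w):
    # Exhaustive search for the decision stump (feature, threshold,
    # polarity) that minimizes the weighted classification error.
    best = None
    for f in range(len(X[0])):
        vals = sorted(set(x[f] for x in X))
        for t in [(a + b) / 2 for a, b in zip(vals, vals[1:])]:
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (pol if xi[f] > t else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, f, t, pol)
    return best

def adaboost(X, y, rounds=15):
    # Standard AdaBoost: reweight examples each round and record
    # (alpha, feature) for every stump that boosting selects.
    n = len(X)
    w = [1.0 / n] * n
    stumps = []
    for _ in range(rounds):
        err, f, t, pol = train_stump(X, y, w)
        err = min(max(err, 1e-10), 1 - 1e-10)   # guard against log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        preds = [(pol if x[f] > t else -pol) for x in X]
        w = [wi * math.exp(-alpha * yi * pi)
             for wi, yi, pi in zip(w, y, preds)]
        z = sum(w)
        w = [wi / z for wi in w]
        stumps.append((alpha, f))
    return stumps

def contribution_ratio(stumps, n_features):
    # Each feature's contribution ratio is its share of the total
    # weak-learner weight; the ratios sum to 1.
    total = sum(a for a, _ in stumps)
    contrib = [0.0] * n_features
    for a, f in stumps:
        contrib[f] += a / total
    return contrib

# Synthetic data (an assumption of this sketch): feature 0 mostly
# determines the label, feature 1 matters weakly, feature 2 is noise.
random.seed(0)
X, y = [], []
for _ in range(200):
    x = [random.random(), random.random(), random.random()]
    X.append(x)
    y.append(1 if x[0] > 0.4 + 0.2 * x[1] else -1)

ratios = contribution_ratio(adaboost(X, y), 3)
print(ratios)  # the informative feature 0 should receive the largest share
```

Ranking features by `ratios` and keeping the top-scoring ones is the feature-selection step; unlike an SVM margin, the ratio aggregates many reweighted rounds, which is why a few outliers have limited influence on it.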