Recently, boosting has come to be widely used in object-detection applications because of its impressive performance in both speed and accuracy. However, learning weak classifiers, which is one of the most significant tasks in applying boosting, is left to users. In Discrete AdaBoost, weak classifiers with binary output are too weak to boost when the training data are complex. Meanwhile, determining the appropriate number of bins for weak classifiers learned by Real AdaBoost is a challenging task: too few bins may fail to approximate the real distribution accurately, while too many may cause over-fitting, increase computation time, and waste storage space. We have developed Ent-Boost, a novel boosting scheme that uses entropy measures to learn weak classifiers efficiently. Class entropy information is used to automatically estimate the optimal number of bins through a discretization process. Then the Kullback–Leibler divergence, which is the relative entropy between probability distri...
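
The abstract describes scoring discretized weak classifiers with the Kullback–Leibler divergence between class-conditional bin distributions. As a minimal sketch of that idea (not the paper's actual implementation; the function names, binning strategy, and smoothing constant here are assumptions for illustration), one could histogram a feature over shared bins for the positive and negative classes and measure the divergence between the two:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """KL divergence D(p || q) between two discrete distributions.

    A small epsilon is added before normalizing to avoid log(0)
    when a bin is empty; this smoothing choice is an assumption.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def feature_kl_score(feature_values, labels, n_bins):
    """Score a scalar feature by the KL divergence between its
    class-conditional bin distributions (hypothetical helper)."""
    edges = np.histogram_bin_edges(feature_values, bins=n_bins)
    pos_hist, _ = np.histogram(feature_values[labels == 1], bins=edges)
    neg_hist, _ = np.histogram(feature_values[labels == 0], bins=edges)
    return kl_divergence(pos_hist, neg_hist)
```

Under this sketch, a feature that separates the two classes well yields a large score, while an uninformative feature yields a score near zero; the paper's Ent-Boost scheme additionally estimates `n_bins` itself from class entropy rather than taking it as a fixed input.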