Feature selection is the problem of choosing a subset of relevant features. Researchers have long searched for optimal feature selection methods; `Branch and Bound' and Focus are two representatives. In general, only exhaustive search can guarantee an optimal subset. However, under certain conditions, exhaustive search can be avoided without sacrificing the optimality of the selected subset. One such condition is the existence of a monotonic measure, with which `Branch and Bound' can guarantee an optimal subset. Unfortunately, most error- or distance-based measures are not monotonic. This work employs a new measure that is monotonic and fast to compute. With this measure, the search for relevant features is guaranteed to be complete, though not exhaustive. An empirical study shows that the algorithm indeed lives up to this claim. Some discussion is given at the end.
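To make the pruning idea concrete, the following is a minimal sketch (not the paper's algorithm) of how a monotonic measure lets a Branch and Bound style search stay complete without being exhaustive. It assumes an inconsistency-rate style measure as the monotonic criterion and a hypothetical `threshold` parameter; both are illustrative choices, not details taken from the paper.

```python
def inconsistency_rate(X, y, features):
    """Fraction of rows that share values on `features` but disagree in class.
    Assumed stand-in for a monotonic measure: removing features can only merge
    pattern groups, so the rate never decreases as the subset shrinks."""
    groups = {}
    for row, label in zip(X, y):
        key = tuple(row[f] for f in features)
        groups.setdefault(key, {}).setdefault(label, 0)
        groups[key][label] += 1
    inconsistent = sum(sum(counts.values()) - max(counts.values())
                       for counts in groups.values())
    return inconsistent / len(X)

def branch_and_bound(X, y, threshold=0.0):
    """Complete (but not exhaustive) search for the smallest subset whose
    measure stays within `threshold`.  A branch that already exceeds the
    threshold is pruned; by monotonicity none of its sub-subsets can recover,
    so no optimal subset is lost."""
    n = len(X[0])
    full = tuple(range(n))
    best = full
    stack = [full]
    seen = set()
    while stack:
        subset = stack.pop()
        if subset in seen:
            continue
        seen.add(subset)
        if inconsistency_rate(X, y, subset) > threshold:
            continue                        # prune: safe because the measure is monotonic
        if len(subset) < len(best):
            best = subset
        for f in subset:                    # branch: try removing one more feature
            child = tuple(x for x in subset if x != f)
            if child and child not in seen:
                stack.append(child)
    return best
```

The point of the sketch is the pruning step: because the measure can only grow (or stay equal) as features are removed, discarding a branch whose measure already violates the threshold cannot eliminate any optimal subset, which is exactly the property that error- or distance-based measures typically lack.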