While decision trees have been used primarily for classification, they can also model regression and function approximation. Like classification trees, regression trees often yield excellent results, with the added advantages of strong explanatory capability and dynamic feature selection in high dimensions. In this paper, we contrast techniques for inducing classification and regression trees. We then present an alternative model of regression rules and describe a new technique for inducing decision rules for regression. This model of ordered disjunctive-normal-form rules is potentially stronger and more compact than decision trees. Preliminary experimental results suggest that the new procedure is effective and can often exceed regression-tree performance.
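As a minimal illustration of the regression-tree setting discussed above (not the authors' induction procedure), the sketch below fits a standard regression tree to a one-dimensional function using scikit-learn's `DecisionTreeRegressor`; the target function and depth are arbitrary choices for the example.

```python
# Sketch: a regression tree as a piecewise-constant function approximator.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Sample a smooth target function on [0, 1].
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()

# A depth-4 tree partitions the input into at most 16 constant regions.
tree = DecisionTreeRegressor(max_depth=4).fit(X, y)
pred = tree.predict(X)

# Mean squared error of the piecewise-constant fit on the training sample.
mse = float(np.mean((pred - y) ** 2))
```

Each leaf of the tree predicts the mean response of its region, so deeper trees trade explanatory compactness for a finer approximation.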
Sholom M. Weiss, Nitin Indurkhya