A new approach to the induction of multivariate decision trees is proposed. A linear decision function (hyperplane) is used at each non-terminal node of a binary tree to split the data. The search strategy is based on dipolar criterion functions and exploits the basis exchange algorithm as an optimization procedure. Feature selection is used to eliminate redundant and noisy features at each node. To avoid over-fitting, the tree is pruned back after the growing phase. The results of experiments on several real-life datasets are presented and compared with those obtained by state-of-the-art decision tree methods.
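As a rough illustration of the oblique-split idea described above, the following minimal Python sketch routes examples through a single tree node using a hyperplane test w·x + b. All names (ObliqueNode, fit_hyperplane) are hypothetical, and the least-squares fit stands in for the paper's dipolar criterion and basis exchange optimization, which are not reproduced here.

```python
import numpy as np

# Illustrative only: a single multivariate (oblique) split node.
# The hyperplane weights come from an ordinary least-squares fit to
# +/-1 class targets -- a stand-in, not the paper's split search.

class ObliqueNode:
    def __init__(self, w, b, left=None, right=None, label=None):
        self.w, self.b = w, b            # hyperplane: w @ x + b
        self.left, self.right = left, right
        self.label = label               # set only for leaf nodes

    def route(self, x):
        """Send one example to the left or right child."""
        return self.left if self.w @ x + self.b <= 0 else self.right

def fit_hyperplane(X, y):
    """Placeholder split search: least-squares fit of +/-1 targets."""
    t = np.where(y == 1, 1.0, -1.0)
    A = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    coef, *_ = np.linalg.lstsq(A, t, rcond=None)
    return coef[:-1], coef[-1]                  # weights, bias

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # linearly separable toy data
    w, b = fit_hyperplane(X, y)
    node = ObliqueNode(w, b,
                       left=ObliqueNode(None, None, label=0),
                       right=ObliqueNode(None, None, label=1))
    preds = np.array([node.route(x).label for x in X])
    print("training accuracy:", (preds == y).mean())
```

In a full induction procedure, such splits would be found recursively at each non-terminal node, with per-node feature selection and post-growth pruning as the abstract describes.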