In this paper we propose a novel classification algorithm that fits models of different complexity to separate regions of the input space. The goal is to strike a balance between global and local learning strategies by decomposing the classification task into simpler subproblems; each subproblem narrows the learning problem to a local region of high example density in the input space. Specifically, our approach applies a clustering algorithm to each set of training examples belonging to the same class; every resulting cluster becomes an intermediate concept that is learned by selecting a model of (estimated) optimal complexity. Experimental results on real-world domains show consistently good predictive accuracy for our piecewise model-fitting strategy.
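The following is a minimal sketch of this piecewise strategy, assuming scikit-learn. The clustering algorithm (k-means), the candidate model family (RBF SVMs of varying gamma), the cross-validation-based complexity selection, and the winner-take-all prediction rule are illustrative stand-ins chosen for concreteness, not the paper's exact components.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


class PiecewiseClassifier:
    """Cluster each class into local regions, then fit one model per cluster,
    choosing its complexity by cross-validation (illustrative sketch)."""

    def __init__(self, n_clusters=2, gammas=(0.01, 0.1, 1.0)):
        self.n_clusters = n_clusters  # clusters (intermediate concepts) per class
        self.gammas = gammas          # candidate complexities for the local models
        self.experts_ = []            # list of (class_label, fitted_model) pairs

    def fit(self, X, y):
        for label in np.unique(y):
            mask = (y == label)
            # Decompose the class into local regions of high example density.
            clusters = KMeans(n_clusters=self.n_clusters, n_init=10).fit_predict(X[mask])
            for c in range(self.n_clusters):
                # Intermediate concept: this cluster (positive) vs. everything else.
                target = np.zeros(len(y), dtype=int)
                target[np.where(mask)[0][clusters == c]] = 1
                # Select the model whose (estimated) complexity best fits the concept.
                best_model, best_score = None, -np.inf
                for gamma in self.gammas:
                    model = SVC(kernel="rbf", gamma=gamma, probability=True)
                    score = cross_val_score(model, X, target, cv=3).mean()
                    if score > best_score:
                        best_model, best_score = model, score
                self.experts_.append((label, best_model.fit(X, target)))
        return self

    def predict(self, X):
        # Each local expert scores membership in its concept; the class of the
        # highest-scoring expert is returned.
        scores = np.column_stack([m.predict_proba(X)[:, 1] for _, m in self.experts_])
        labels = np.array([lab for lab, _ in self.experts_])
        return labels[scores.argmax(axis=1)]
```

In this sketch the number of clusters per class and the candidate hyperparameter grid are fixed by hand; in practice both would be estimated from the data, as the complexity-selection step in the approach described above suggests.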