Rules are a form of human-understandable knowledge, and rule-based methods are widely used to build decision support systems. However, most current rule-based classification systems build small classifiers in which no rules account for exceptional instances and a default prediction plays a major role. In this paper, we discuss two schemes for building rule-based classifiers using multiple rules and negative target rules. In these schemes, negative rules pick up exceptional instances and multiple rules provide alternative predictions. The default prediction is removed, so every prediction is supported by rules that explain it. One risk of building a large rule-based classifier is that it may overfit the training data and yield low predictive accuracy. We show experimentally that one such classifier is more accurate than C4.5rules, a benchmark rule-based classifier.
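As a rough illustration only, and not the scheme defined in this paper, the sketch below shows in Python how default-free prediction with multiple and negative rules might proceed: positive rules that cover an instance propose class labels, negative rules veto classes for instances they identify as exceptions, and the remaining candidates are ranked, so every prediction can be traced back to the rules that fired. All names (Rule, predict, the confidence-based ranking) are hypothetical assumptions.

    # Hypothetical sketch: prediction from multiple and negative rules, no default class.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Rule:
        covers: Callable[[Dict[str, object]], bool]  # does the rule's condition match the instance?
        target: str                                  # class label the rule asserts (or denies)
        negative: bool = False                       # True: "NOT this class" for covered instances
        confidence: float = 0.0                      # used here only to rank alternative predictions

    def predict(instance: Dict[str, object], rules: List[Rule]) -> List[str]:
        """Return class labels ordered by confidence; empty only if no rule covers the instance."""
        fired = [r for r in rules if r.covers(instance)]
        # Positive rules propose candidate classes.
        candidates = {r.target: r.confidence for r in fired if not r.negative}
        # Negative rules pick up exceptional instances and veto the classes they deny.
        vetoed = {r.target for r in fired if r.negative}
        # Multiple rules yield alternative predictions; no default class is ever returned.
        return sorted((c for c in candidates if c not in vetoed),
                      key=lambda c: candidates[c], reverse=True)

    # Example: an instance covered by two positive rules and one negative rule.
    rules = [
        Rule(covers=lambda x: x["age"] > 30, target="approve", confidence=0.9),
        Rule(covers=lambda x: x["income"] < 20_000, target="reject", confidence=0.7),
        Rule(covers=lambda x: x["age"] > 30 and x["income"] < 20_000,
             target="approve", negative=True, confidence=0.8),
    ]
    print(predict({"age": 40, "income": 15_000}, rules))  # ['reject']: 'approve' is vetoed by the negative rule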