Pre-pruning and post-pruning are two standard methods for dealing with noise in concept learning. Pre-pruning methods are very efficient, while post-pruning methods are typically more accurate but much slower, because they must first generate an overly specific concept description. We have experimented with a variety of pruning methods, including two new methods that integrate pre- and post-pruning in order to achieve both accuracy and efficiency. This is verified with test series in a chess position classification task.