A relational probability tree (RPT) is a type of decision tree that can be used for probabilistic classification of instances with a relational structure. Each leaf of an RPT contains a probability model that determines, for each class, the probability that an instance belongs to that class. The only kind of probability model used in RPTs so far is the multinomial probability distribution. In this paper we show how to integrate a more complex kind of probability model, based on the concept of combining rules (such as noisy-or), into RPTs. We introduce two learning algorithms for such RPTs and experimentally compare them to the learning algorithm for standard RPTs. The experiments indicate that using probability models based on combining rules does not significantly influence the quality of the RPTs' probability estimates. We perform additional experiments to investigate the reason for this result. The main conclusion is that probability models based o...
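As a minimal illustration of the noisy-or combining rule mentioned above (this is a generic sketch of the rule, not the paper's own implementation; the function name is ours), noisy-or combines the per-parent probabilities p_i of the active parents into P(Y=1) = 1 - prod(1 - p_i):

```python
def noisy_or(parent_probs):
    """Noisy-or combining rule: each active parent independently
    causes the outcome with probability p_i, so the outcome fails
    only if all parents fail: P(Y=1) = 1 - prod(1 - p_i)."""
    prod_fail = 1.0
    for p in parent_probs:
        prod_fail *= (1.0 - p)
    return 1.0 - prod_fail

# Three active parents, each with probability 0.5:
print(noisy_or([0.5, 0.5, 0.5]))  # 0.875
```

Note that with no active parents the rule yields probability 0, and adding parents can only increase the combined probability, which is what makes combining rules attractive for relational data where an instance may be linked to a variable number of related objects.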