A major difficulty in building Bayesian network models is the size of conditional probability tables (CPTs), which grows exponentially with the number of parents. One way of dealing with this problem is to use parametric conditional probability distributions, which typically require a number of parameters that is only linear in the number of parents. In this paper we introduce a new class of parametric models, the pICI models, that aim at lowering the number of parameters required to specify local probability distributions while remaining capable of modeling a variety of interactions. A subset of the pICI models is decomposable, which leads to significantly faster inference compared to models that cannot be decomposed. We also show that the pICI models are especially useful for learning parameters from small data sets, where they yield higher accuracy than learning CPTs.
Adam Zagorecki, Mark Voortman, Marek J. Druzdzel
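To make the parameter-count contrast concrete, consider a binary effect variable $Y$ with $n$ binary parents $X_1, \ldots, X_n$ (the symbols $Y$, $X_i$, $p_0$, and $p_i$ below are introduced here only for illustration and do not appear in the abstract). A full CPT requires one probability per parent configuration, i.e. $2^n$ parameters, whereas a standard independence-of-causal-influence model such as the leaky noisy-OR, shown here as a sketch of the general ICI idea rather than the definition of the pICI models proposed in the paper, needs only $n + 1$ parameters:
\[
P(Y = y \mid X_1, \ldots, X_n) \;=\; 1 - (1 - p_0) \prod_{i \,:\, X_i = x_i} (1 - p_i),
\]
where $p_0$ is the leak probability and $p_i$ is the probability that the presence of cause $X_i$ alone produces the effect. The parameter count thus grows linearly rather than exponentially in the number of parents.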