We study two-layer belief networks of binary random variables in which the conditional probabilities Pr[child | parents] depend monotonically on weighted sums of the parents. In large networks where exact probabilistic inference is intractable, we show how to compute upper and lower bounds on many probabilities of interest. In particular, using methods from large deviation theory, we derive rigorous bounds on marginal probabilities such as Pr[children] and prove rates of convergence for the accuracy of our bounds as a function of network size. Our results apply to networks with generic transfer function parameterizations of the conditional probability tables, such as sigmoid and noisy-OR. They also explicitly illustrate the types of averaging behavior that can simplify the problem of inference in large networks.
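For concreteness, a transfer function parameterization of the kind described above can be sketched as follows (the weight symbols $\theta_{ij}$ and the index conventions are our notation, not quoted from the abstract):
$$\Pr[X_j = 1 \mid X_1, \ldots, X_N] \;=\; f\!\left(\sum_{i=1}^{N} \theta_{ij} X_i\right),$$
where each parent $X_i \in \{0,1\}$ and $f$ is a monotone transfer function, for example the sigmoid $f(z) = 1/(1 + e^{-z})$ or the noisy-OR $f(z) = 1 - e^{-z}$ (with nonnegative weights in the noisy-OR case).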
Michael J. Kearns, Lawrence K. Saul