Multiply sectioned Bayesian networks (MSBNs) provide a coherent and flexible formalism for representing uncertain knowledge in large domains. Global consistency among the subnets of an MSBN is achieved through communication. When a subnet updates its belief with respect to an adjacent subnet, existing inference operations require repeated belief propagations within the receiving subnet, one per linkage between the two subnets, which makes communication inefficient. We redefine these operations so that two such propagations suffice. We prove that the new operations improve efficiency without compromising coherence. An MSBN must be initialized before inference can take place. Under existing methods, initialization involves dedicated operations not shared with inference. We show that the new inference operations presented here unify inference and initialization. Hence the new operations are not only more efficient but also simpler.