The lower and upper bounds for the information capacity of two-layer feedforward neural networks with binary interconnections, integer thresholds for the hidden units, and zero threshold for the output unit are obtained in two steps. First, through a constructive approach based on statistical analysis, it is shown that a specifically constructed $(N\text{-}2L\text{-}1)$ network with $N$ input units, $2L$ hidden units, and one output unit is capable of implementing, with probability almost one, any dichotomy of $O(W/\ln W)$ random samples drawn from certain continuous distributions, where $W$ is the total number of weights of the network. This quantity is then used as a lower bound for the information capacity $C$ of all $(N\text{-}2L\text{-}1)$ networks with binary weights. Second, an upper bound is obtained and shown to be $O(W)$ by a simple counting argument. Therefore, we have $\Omega(W/\ln W) \le C \le O(W)$.
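To make the second step concrete, the counting argument can be sketched as follows; this is a reconstruction under stated assumptions, not the paper's own derivation, and the symbol $T$ (the number of admissible integer values per hidden-unit threshold) is introduced here only for illustration. With $W$ binary weights and $2L$ integer thresholds, the network realizes at most $2^{W} T^{2L}$ distinct input-output functions, while implementing every dichotomy of $m$ fixed samples requires at least $2^{m}$ distinct functions. Hence
$$
2^{m} \;\le\; 2^{W}\,T^{2L}
\qquad\Longrightarrow\qquad
m \;\le\; W + 2L\log_{2}T \;=\; O(W),
$$
provided $T$ is bounded or grows at most polynomially in $N$ (for a fully connected network, $W = 2L(N+1)$, so $2L\log_{2}N \le W$). Since the capacity $C$ is the largest such $m$, it follows that $C = O(W)$.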