The generalization ability of architectures of different sizes, with one and two hidden layers, trained with backpropagation combined with early stopping, has been analyzed. The dependence of the generalization process on the complexity of the function being implemented is studied using a recently introduced measure for the complexity of Boolean functions. For a whole set of symmetric Boolean functions it is found that large neural networks have better generalization ability than smaller ones across a large range of function complexity, and also
Leonardo Franco, José M. Jerez, José
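
The experimental setup described in the abstract can be illustrated with a minimal sketch (not the authors' code): networks of different sizes are trained with backpropagation and early stopping on a symmetric Boolean function, and generalization is measured as accuracy on the held-out part of the truth table. The use of scikit-learn's MLPClassifier, the input size n = 10, the parity target, the train/test split, and the particular hidden-layer sizes are all illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch of the setup: compare generalization of MLPs of
# different sizes on a symmetric Boolean function. All concrete
# choices below (n, target function, splits, sizes) are assumptions.
import itertools
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

n = 10  # number of Boolean inputs (assumption)

# All 2^n Boolean input vectors (the full truth table).
X = np.array(list(itertools.product([0, 1], repeat=n)))

# A symmetric Boolean function depends only on the number of 1s in
# the input; parity of that count is one illustrative choice.
y = X.sum(axis=1) % 2

# Hold out part of the truth table to measure generalization.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

# Small vs. large networks, with one and two hidden layers.
for hidden in [(5,), (20,), (80,), (20, 20)]:
    clf = MLPClassifier(hidden_layer_sizes=hidden,
                        solver="sgd",             # plain backpropagation
                        early_stopping=True,      # stop when validation score plateaus
                        validation_fraction=0.2,
                        max_iter=2000,
                        random_state=0)
    clf.fit(X_train, y_train)
    print(hidden, "test accuracy:", clf.score(X_test, y_test))
```

Repeating such a run over a family of symmetric functions of varying complexity, and averaging over random seeds, would mirror the kind of comparison the abstract describes.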