We propose a general method for estimating the distance between a compact subspace $K$ of the space $L^1([0,1]^s)$ of Lebesgue integrable functions defined on the hypercube $[0,1]^s$ and the class of functions computed by artificial neural networks with a single hidden layer, each unit evaluating a sigmoidal activation function. Our lower bounds are stated in terms of an invariant that measures the oscillations of the functions of $K$ around the origin. As an application, we estimate the minimal number of neurons required to approximate bounded functions satisfying uniform Lipschitz conditions of order $\alpha$ with accuracy $\varepsilon$.

Key words: Mathematics of Neural Networks, Approximation Theory
José Luis Montaña, Cruz E. Borges
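For concreteness, a minimal sketch of the objects involved, in notation introduced here (the paper's own symbols may differ): a single-hidden-layer network with $n$ sigmoidal units computes a finite sum of ridge functions, and the quantity being estimated is the worst-case $L^1$ deviation of $K$ from this class.

% Sketch only: \mathcal{N}_n and dist are our shorthand, not necessarily
% the paper's notation; \sigma denotes a fixed sigmoidal activation.
\[
  \mathcal{N}_n \;=\; \Bigl\{\, x \mapsto \sum_{i=1}^{n} c_i\,\sigma(a_i \cdot x + b_i)
  \;:\; a_i \in \mathbb{R}^s,\ b_i, c_i \in \mathbb{R} \,\Bigr\},
\]
\[
  \operatorname{dist}(K, \mathcal{N}_n) \;=\; \sup_{f \in K}\,\inf_{g \in \mathcal{N}_n}
  \|f - g\|_{L^1([0,1]^s)}.
\]

A lower bound on $\operatorname{dist}(K, \mathcal{N}_n)$ then translates directly into a lower bound on the number of neurons $n$ needed to reach a prescribed accuracy $\varepsilon$ uniformly over $K$.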