Enrique Alba, J. Francisco Chicano

Abstract. Training neural networks is a complex task of great importance in supervised learning. In this work we tackle this problem with five algorithms, and we report a set of results intended to foster future comparisons by following a standard evaluation methodology (the Prechelt approach). To study population-based, local search, and hybrid algorithms within the same paper, we have selected two gradient descent algorithms, Backpropagation and Levenberg-Marquardt, one population-based heuristic, a Genetic Algorithm, and two hybrid algorithms combining the latter with each of the two local search methods. Our benchmark is composed of problems arising in Medicine, and our conclusions clearly establish the advantages of the proposed hybrids over the pure algorithms.