In this paper, we present the results of an experimental comparison of seven different weight initialization methods on twelve different problems. The comparison is performed by...
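The kind of comparison described above can be sketched in a few lines. The snippet below is illustrative only: it compares three common initialization schemes (plain uniform, Glorot/Xavier uniform, He normal), which are assumptions, not necessarily the seven methods the paper evaluates, and it measures only the variance of a layer's pre-activations on random input rather than full training performance.

```python
# Illustrative comparison of weight initialization schemes (not the paper's
# exact seven methods) via the variance of a layer's pre-activations.
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out, n_samples = 256, 128, 1000

def uniform_init(fan_in, fan_out):
    # Plain uniform in [-0.5, 0.5]
    return rng.uniform(-0.5, 0.5, size=(fan_in, fan_out))

def glorot_init(fan_in, fan_out):
    # Glorot/Xavier uniform: limit = sqrt(6 / (fan_in + fan_out))
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He normal: std = sqrt(2 / fan_in), suited to ReLU layers
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

x = rng.normal(size=(n_samples, fan_in))
for name, init in [("uniform", uniform_init),
                   ("glorot", glorot_init),
                   ("he", he_init)]:
    w = init(fan_in, fan_out)
    print(f"{name:8s} pre-activation variance: {(x @ w).var():.3f}")
```

Schemes like Glorot and He are designed precisely so that this variance stays near a fixed target (about 2·fan_in/(fan_in+fan_out) and 2, respectively) regardless of layer width, which is one common axis along which initializations are compared.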
Abstract. In this paper, we conduct a comparative study of hybrid methods for optimizing multilayer perceptrons: a model that optimizes the architecture and initial weights of mul...
The use of multilayer perceptrons (MLPs) with threshold activation functions (binary step activations) greatly reduces the complexity of the hardware implementation of neural networks...
Vassilis P. Plagianakos, George D. Magoulas, Micha...
In this paper, three approaches are presented for generating and validating sequences of neural nets of different sizes. First, a growing method is given along with several weight ini...
Pramod Lakshmi Narasimha, Walter Delashmit, Michae...
A genetic programming method is investigated for optimizing both the architecture and the connection weights of multilayer feedforward neural networks. The genotype of each networ...
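As a rough illustration of evolving network weights, the sketch below runs a plain genetic algorithm over the weight vector of a fixed 2-2-1 network on the XOR task. This is a much-simplified stand-in, assuming truncation selection, blend crossover, and Gaussian mutation; the paper's method is genetic programming and additionally evolves the architecture, which this sketch does not attempt.

```python
# Simplified sketch: a plain GA over the flattened weights of a fixed 2-2-1
# feedforward net (tanh hidden, sigmoid output), fit to XOR. A stand-in for
# the paper's genetic programming, which also evolves the architecture.
import numpy as np

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
N_W = 9  # 4 hidden weights + 2 hidden biases + 2 output weights + 1 output bias

def forward(w, X):
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)  # negative MSE

pop = rng.normal(0, 1, size=(60, N_W))
for gen in range(300):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[::-1][:20]]   # truncation selection
    parents = elite[rng.integers(0, 20, size=(40, 2))]
    alpha = rng.random((40, 1))
    children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # blend crossover
    children += rng.normal(0, 0.3, size=children.shape)             # mutation
    pop = np.vstack([elite, children])

best = max(pop, key=fitness)
print("best MSE:", -fitness(best))
```

Representing the whole network as one flat vector keeps the genome fixed-length; a genetic programming genotype, by contrast, is a variable-size structure, which is what lets it optimize the architecture and the connection weights jointly.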