In this paper, two modified constrained learning algorithms are proposed to obtain better generalization performance and a faster convergence rate. The additional cost terms of the ...
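The abstract is cut off before the additional cost terms are defined. Purely as an illustration of the general pattern, the sketch below augments a mean-squared-error objective with an extra penalty term; the quadratic penalty and the weighting factor lam are assumptions, not the formulation proposed in the paper.

```python
import numpy as np

def augmented_cost(y_pred, y_true, weights, lam=1e-3):
    """MSE objective extended with an additional cost term.

    The quadratic weight penalty is only a placeholder; the paper's actual
    additional cost terms are not given in the truncated abstract.
    """
    mse = np.mean((y_pred - y_true) ** 2)      # standard error term
    extra = lam * np.sum(weights ** 2)         # assumed additional cost term
    return mse + extra

# toy usage
y_true = np.array([0.0, 1.0, 1.0])
y_pred = np.array([0.1, 0.8, 1.2])
w = np.array([0.5, -0.3, 0.2])
print(augmented_cost(y_pred, y_true, w))
```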
In this paper, Parallel Evolutionary Algorithms for integer weight neural network training are presented. To this end, each processor is assigned a subpopulation of potential solut...
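The truncation leaves out how the subpopulations interact, but the wording suggests an island-model scheme: each processor evolves its own subpopulation of integer weight vectors and occasionally exchanges individuals. The sketch below simulates that scheme sequentially on a toy fitness function; the integer mutation step, truncation selection, and ring migration are assumptions, not the paper's exact operators.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(w, X, y):
    """Negative squared error of a single integer-weight linear unit (a toy
    stand-in for the network training error; the actual network architecture
    is not specified in the truncated abstract)."""
    return -np.sum((X @ w - y) ** 2)

def evolve_island(pop, X, y, generations=20):
    """Evolve one subpopulation: integer mutation plus truncation selection."""
    for _ in range(generations):
        children = pop + rng.integers(-1, 2, size=pop.shape)  # integer mutation steps
        both = np.vstack([pop, children])
        scores = np.array([fitness(w, X, y) for w in both])
        pop = both[np.argsort(scores)[::-1][: len(pop)]]      # keep the fittest
    return pop

# toy data; each "processor" owns one subpopulation (islands evolved here in a loop)
X = rng.integers(-3, 4, size=(30, 4)).astype(float)
y = X @ np.array([2, -1, 0, 3])
islands = [rng.integers(-5, 6, size=(10, 4)) for _ in range(4)]
for epoch in range(5):
    islands = [evolve_island(p, X, y) for p in islands]       # parallelizable step
    # ring migration: best of each island replaces the worst of the next island
    bests = [p[0].copy() for p in islands]
    for i, p in enumerate(islands):
        p[-1] = bests[(i - 1) % len(islands)]
print(islands[0][0])  # best integer weight vector found on island 0
```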
Max and min operations have interesting properties that facilitate the exchange of information between the symbolic and real-valued domains. As such, neural networks that employ m...
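The abstract does not say how max and min enter the network, so the sketch below only illustrates one common formulation, a morphological (max-plus / min-plus) unit that replaces the usual weighted sum with a maximum or minimum over shifted inputs; this is an assumed example, not necessarily the model the paper uses.

```python
import numpy as np

def max_neuron(x, w):
    """Morphological 'max' unit: max(x + w) instead of a weighted sum.
    An illustrative assumption, not the paper's exact model."""
    return np.max(x + w)

def min_neuron(x, w):
    """Dual 'min' unit using min-plus algebra."""
    return np.min(x + w)

x = np.array([0.2, -1.0, 0.7])
w_max = np.array([0.1, 0.5, -0.3])
w_min = np.array([-0.2, 0.4, 0.0])
print(max_neuron(x, w_max), min_neuron(x, w_min))
```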
In this paper, a neural network for approximating functions is described. The activation functions of the hidden nodes are the Radial Basis Functions (RBF) whose parameters are learn...
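The learning rule for the RBF parameters is cut off, so the sketch below falls back on a standard baseline: Gaussian hidden units with fixed centers and width, and output weights fitted by linear least squares. The grid of centers, the width value, and the least-squares step are assumptions for illustration, not necessarily the training procedure described in the paper.

```python
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian RBF activations of the hidden nodes."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # squared distances
    return np.exp(-d2 / (2.0 * width ** 2))

# toy 1-D function approximation; grid centers and least-squares output weights
# are a standard baseline, not necessarily the paper's learning rule
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])
centers = np.linspace(-3, 3, 10).reshape(-1, 1)
Phi = rbf_features(X, centers, width=0.7)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)                    # output-layer weights
print(np.mean((Phi @ w - y) ** 2))                             # training MSE
```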
It is shown that high-order feedforward neural nets of constant depth with piecewise-polynomial activation functions and arbitrary real weights can be simulated for Boolea...
In the context of Independent Component Analysis (ICA), we propose a simple method for online estimation of activation functions in order to blindly separate instantaneous mixt...
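The estimation method itself is truncated, so the sketch below stands in with a well-known adaptive scheme: an online natural-gradient ICA update in which each output's activation (score) function is switched between a super-Gaussian and a sub-Gaussian form according to a running kurtosis estimate, in the spirit of extended Infomax. The switching rule, step sizes, and toy sources are assumptions, not the authors' estimator.

```python
import numpy as np

rng = np.random.default_rng(2)

# two independent sources (one super-Gaussian, one sub-Gaussian), mixed instantaneously
n = 20000
s = np.vstack([rng.laplace(size=n),          # super-Gaussian source
               rng.uniform(-1, 1, size=n)])  # sub-Gaussian source
A = np.array([[1.0, 0.6], [0.4, 1.0]])
x = A @ s
x = x - x.mean(axis=1, keepdims=True)

W = np.eye(2)
m2 = np.ones(2)                  # running moments for a per-output kurtosis estimate
m4 = 3 * np.ones(2)
eta, rho = 0.002, 0.01
for t in range(n):
    y = W @ x[:, t]
    m2 = (1 - rho) * m2 + rho * y ** 2
    m4 = (1 - rho) * m4 + rho * y ** 4
    kurt = m4 - 3 * m2 ** 2                                     # excess-kurtosis estimate
    phi = np.where(kurt > 0, y + np.tanh(y), y - np.tanh(y))    # switched activation
    W += eta * (np.eye(2) - np.outer(phi, y)) @ W               # natural-gradient update

print(W @ A)   # close to a scaled permutation matrix if separation succeeded
```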
Radial basis function (RBF) networks are efficient general function approximators. They show good generalization performance and are easy to train. Due to theoretical consider...