Sciweavers

298 search results - page 48 / 60
» Optimizing number of hidden neurons in neural networks

CORR 2007 (Springer)
Statistical tools to assess the reliability of self-organizing maps
Results of neural network learning are always subject to some variability, due to sensitivity to initial conditions, convergence to local minima, and, sometimes more dramat...
Eric de Bodt, Marie Cottrell, Michel Verleysen
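
The abstract above concerns run-to-run variability of neural network training. As an illustration only (plain NumPy, not the authors' statistical tools), a minimal sketch that trains the same small 1-D self-organizing map from several random initializations and reports the spread of the resulting quantization error; the function and parameter names are made up for this sketch.

import numpy as np

def train_som(data, n_units=10, n_iter=2000, lr=0.5, sigma=2.0, seed=0):
    # Train a tiny 1-D SOM on 2-D data; return the codebook vectors.
    rng = np.random.default_rng(seed)
    codebook = rng.uniform(data.min(0), data.max(0), size=(n_units, data.shape[1]))
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))  # best-matching unit
        frac = t / n_iter
        lr_t = lr * (1.0 - frac)                  # decaying learning rate
        sigma_t = sigma * (1.0 - frac) + 0.5      # decaying neighbourhood width
        grid_dist = np.abs(np.arange(n_units) - bmu)
        h = np.exp(-grid_dist ** 2 / (2 * sigma_t ** 2))  # neighbourhood function
        codebook += lr_t * h[:, None] * (x - codebook)
    return codebook

def quantization_error(data, codebook):
    d = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
    return d.min(axis=1).mean()

data = np.random.default_rng(42).normal(size=(500, 2))
# Same data, different initial conditions: the spread of the quantization
# error across seeds is one crude indicator of how reliable a single run is.
errors = [quantization_error(data, train_som(data, seed=s)) for s in range(10)]
print(f"quantization error: mean={np.mean(errors):.3f}  std={np.std(errors):.3f}")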

GECCO 2008 (Springer)
Towards high speed multiobjective evolutionary optimizers
One of the major difficulties when applying Multiobjective Evolutionary Algorithms (MOEAs) to real-world problems is the large number of objective function evaluations. Approximate...
A. K. M. Khaled Ahsan Talukder
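
The entry above is about reducing expensive objective-function evaluations with approximate models. A hedged sketch of that general idea (not the paper's optimizer): fit a cheap quadratic surrogate on a small budget of true evaluations, screen many candidates with it, and spend true evaluations only on the top-ranked few. The objective and helper names are invented for illustration.

import numpy as np

def expensive_objective(x):
    # Stand-in for a costly simulation (here just a cheap analytic function).
    return np.sum(x ** 2) + 0.5 * np.sin(5 * x[0])

def fit_surrogate(X, y):
    # Quadratic least-squares surrogate: y ~ [1, x, x^2] @ w.
    Phi = np.hstack([np.ones((len(X), 1)), X, X ** 2])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict_surrogate(w, X):
    return np.hstack([np.ones((len(X), 1)), X, X ** 2]) @ w

rng = np.random.default_rng(0)
dim = 3

# 1) Spend a small budget of true (expensive) evaluations to fit the surrogate.
X_train = rng.uniform(-2, 2, size=(20, dim))
y_train = np.array([expensive_objective(x) for x in X_train])
w = fit_surrogate(X_train, y_train)

# 2) Screen many candidate solutions with the cheap surrogate ...
candidates = rng.uniform(-2, 2, size=(500, dim))
promising = candidates[np.argsort(predict_surrogate(w, candidates))[:5]]

# 3) ... and call the expensive objective only for the most promising ones.
for x in promising:
    print(x.round(2), "->", round(expensive_objective(x), 3))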

IJCNN 2007 (IEEE)
Analyzing the Fuzzy ARTMAP Matchtracking mechanism with Co-Objective Optimization Theory
In the process of learning a pattern I, the Fuzzy ARTMAP algorithm's templates (i.e., the weight vectors corresponding to nodes of its category representation layer) compete for ...
José Castro, Michael Georgiopoulos, Jimmy S...
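
The abstract describes Fuzzy ARTMAP templates competing to learn a pattern. A minimal sketch of the standard competition and match-tracking step (textbook Fuzzy ART choice and vigilance test, not the paper's co-objective analysis); the concrete weights and labels are made up.

import numpy as np

def choice(I, W, alpha=0.001):
    # Fuzzy ART choice function T_j = |I ^ w_j| / (alpha + |w_j|), ^ = min.
    return np.minimum(I, W).sum(axis=1) / (alpha + W.sum(axis=1))

def match(I, w):
    # Match function |I ^ w| / |I| used in the vigilance test.
    return np.minimum(I, w).sum() / I.sum()

a = np.array([0.2, 0.7])
I = np.concatenate([a, 1.0 - a])           # complement-coded input [a, 1 - a]

# Templates (weight vectors of committed category nodes) and their class labels.
W = np.array([[0.10, 0.50, 0.60, 0.20],
              [0.25, 0.75, 0.85, 0.35],
              [0.70, 0.20, 0.20, 0.70]])
labels = np.array([0, 1, 0])
target, rho = 1, 0.5                       # desired class, baseline vigilance

# Templates compete in order of choice value; match tracking raises the
# vigilance just above the match of a winner whose label disagrees with the
# target, which disqualifies it and forces the search to continue.
for j in np.argsort(-choice(I, W)):
    m = match(I, W[j])
    if m < rho:
        continue                           # fails the vigilance test
    if labels[j] == target:
        print(f"template {j} wins and learns the pattern")
        break
    rho = m + 1e-3                         # match tracking
else:
    print("no committed template left; a new category node is created")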

TNN 2010
On the weight convergence of Elman networks
An Elman network (EN) can be viewed as a feedforward (FF) neural network with an additional set of inputs from the context layer (feedback from the hidden layer). Therefo...
Qing Song
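
The abstract views an Elman network as a feedforward network plus an extra set of inputs copied from the previous hidden state (the context layer). A minimal NumPy sketch of that forward pass, for illustration only:

import numpy as np

class ElmanNetwork:
    # The hidden layer sees the current input plus the previous hidden state.
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_ih = rng.normal(0, 0.3, (n_hidden, n_in))      # input   -> hidden
        self.W_ch = rng.normal(0, 0.3, (n_hidden, n_hidden))  # context -> hidden
        self.W_ho = rng.normal(0, 0.3, (n_out, n_hidden))     # hidden  -> output
        self.context = np.zeros(n_hidden)                     # context layer state

    def step(self, x):
        # The context layer is a copy of the hidden activations from the
        # previous time step, fed back as an additional set of inputs.
        h = np.tanh(self.W_ih @ x + self.W_ch @ self.context)
        self.context = h.copy()
        return self.W_ho @ h

net = ElmanNetwork(n_in=2, n_hidden=5, n_out=1)
for t, x in enumerate([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]):
    print(f"t={t}, output={net.step(np.array(x))}")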

GECCO 2011 (Springer)
Fuzzy dynamical genetic programming in XCSF
A number of representation schemes have been presented for use within Learning Classifier Systems, ranging from binary encodings to Neural Networks, and more recently Dynamical ...
Richard Preen, Larry Bull
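
For reference, the "binary encodings" end of the representation range mentioned above corresponds to the classic ternary conditions of Learning Classifier Systems; richer schemes such as neural or fuzzy dynamical GP conditions replace this matching step with something more expressive. A minimal sketch (illustration only):

# Classic ternary condition for a Learning Classifier System: each position is
# '0', '1', or '#' (don't care). Neural, fuzzy and GP-based condition schemes
# generalize this matching step to richer, often continuous, representations.
def matches(condition: str, state: str) -> bool:
    return all(c == '#' or c == s for c, s in zip(condition, state))

population = ["1#0#", "##11", "0101"]       # toy classifier conditions
state = "1011"                              # current binary input state
print([cond for cond in population if matches(cond, state)])  # -> ['##11']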