Sciweavers

Search: Dynamic Programming Algorithm for Training Functional Networ...
NPL
2006
CB3: An Adaptive Error Function for Backpropagation Training
Effective backpropagation training of multi-layer perceptrons depends on the incorporation of an appropriate error or objective function. Classification-based (CB) error functions ...
Michael Rimer, Tony Martinez
TNN
2010
Novel maximum-margin training algorithms for supervised neural networks
This paper proposes three novel training methods, two of them based on the back-propagation approach and a third one based on information theory for Multilayer Perceptron (MLP) bin...
Oswaldo Ludwig, Urbano Nunes
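The snippet above truncates the paper's actual methods. As a generic illustration of what "maximum-margin training" means for a binary classifier, here is a minimal hinge-loss sketch (not the authors' algorithms) in plain Python: a linear unit is pushed until every example sits outside a unit margin.

```python
# Generic sketch of maximum-margin training via the hinge loss.
# This is NOT the paper's proposed methods (those are truncated
# above); it only illustrates the margin idea. Labels are +1/-1
# and the unit is updated whenever an example violates the margin.

def train_hinge(samples, labels, lr=0.1, epochs=200):
    """Subgradient descent on sum(max(0, 1 - y * (w.x + b)))."""
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score < 1.0:          # inside the margin: push out
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Linearly separable toy data
X = [(0.0, 0.0), (0.0, 1.0), (2.0, 2.0), (3.0, 2.0)]
Y = [-1, -1, 1, 1]
w, b = train_hinge(X, Y)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
         for x in X]
```

Unlike a plain perceptron, the update keeps firing until each example clears the margin, not merely the decision boundary.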
ICMCS
2006
IEEE
On Training Neural Network Algorithms for Odor Identification for Future Multimedia Communication Systems
Future multimedia communication systems can be developed to identify, transmit, and deliver odors in addition to voice and image. In this paper, an improved odor identification method is i...
Ki-Hyeon Kwon, Namyong Kim, Hyung-Gi Byun, Krishna...
IJIT
2004
Improving the Convergence of the Backpropagation Algorithm Using Local Adaptive Techniques
Since the presentation of the backpropagation algorithm, a wide variety of improvements to the technique for training feed-forward neural networks have been proposed. This articl...
Z. Zainuddin, N. Mahat, Y. Abu Hassan
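The article's specific techniques are cut off above. As one classic example of a "local adaptive technique" for backpropagation convergence, here is a sketch in the spirit of delta-bar-delta (an assumption, not necessarily what the authors use): each weight keeps its own learning rate, grown while successive gradient signs agree and halved when they flip.

```python
# Illustrative sketch of per-weight adaptive learning rates
# (delta-bar-delta style) for gradient training of a single
# sigmoid unit. The article's own techniques are not shown in
# the snippet; this only demonstrates the general idea.
import math

def sigmoid(z):
    if z > 500:   # clamp to avoid overflow in exp
        return 1.0
    if z < -500:
        return 0.0
    return 1.0 / (1.0 + math.exp(-z))

def train_adaptive(samples, targets, epochs=300):
    n = len(samples[0])
    w = [0.0] * n
    rates = [0.5] * n               # one learning rate per weight
    prev = [0.0] * n
    for _ in range(epochs):
        grad = [0.0] * n
        for x, t in zip(samples, targets):
            y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
            delta = (y - t) * y * (1.0 - y)   # dE/dz, squared error
            for i in range(n):
                grad[i] += delta * x[i]
        for i in range(n):
            if grad[i] * prev[i] > 0:
                rates[i] = min(rates[i] * 1.1, 2.0)  # consistent sign: speed up
            elif grad[i] * prev[i] < 0:
                rates[i] *= 0.5                      # oscillation: slow down
            w[i] -= rates[i] * grad[i]
        prev = grad
    return w

# Learn logical OR; the third input is a constant bias of 1.
X = [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 1)]
T = [0, 1, 1, 1]
w = train_adaptive(X, T)
preds = [1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x))) > 0.5 else 0
         for x in X]
```

The local rates let well-behaved weights accelerate while oscillating ones are damped, which is the usual motivation for such schemes.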
IJCAI
1997
Extracting Propositions from Trained Neural Networks
This paper presents an algorithm for extracting propositions from trained neural networks. The algorithm is a decompositional approach which can be applied to any neural networ...
Hiroshi Tsukimoto
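To illustrate what a decompositional approach looks like in its simplest form (this is a generic sketch, not Tsukimoto's actual algorithm), one can treat each trained unit as a boolean function over 0/1 inputs and read off the input combinations for which it fires as a DNF proposition:

```python
# Generic sketch of decompositional rule extraction: enumerate
# the 0/1 inputs of a trained thresholded unit and emit a DNF
# term for each combination on which it fires. Illustrative
# only; NOT the paper's algorithm.
from itertools import product

def extract_dnf(weights, bias, names):
    """Return DNF terms for which a thresholded unit fires."""
    terms = []
    for bits in product((0, 1), repeat=len(weights)):
        z = sum(w * b for w, b in zip(weights, bits)) + bias
        if z > 0:                   # unit fires on this input
            lits = [n if b else f"not {n}"
                    for n, b in zip(names, bits)]
            terms.append(" and ".join(lits))
    return terms

# A unit that fires only when both inputs are 1 (i.e. x1 AND x2)
terms = extract_dnf([1.0, 1.0], -1.5, ["x1", "x2"])
# terms == ["x1 and x2"]
```

Exhaustive enumeration is exponential in the fan-in, which is why practical decompositional methods prune or approximate rather than enumerate.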