SSPR
2004
Springer

Bounds for the Average Generalization Error of the Mixture of Experts Neural Network

Abstract. In this paper we derive an upper bound on the average-case generalization error of the mixture-of-experts modular neural network, based on an average-case generalization error bound for an isolated neural network. In doing so, we also generalize a previous bound for this architecture that was restricted to special problems. We further present an empirically obtained correction factor for the original average generalization error that yields more accurate error bounds for the six data sets used in the experiments. These experiments demonstrate the validity of the derived error bound for the mixture-of-experts modular neural network and show how it can be used in practice.
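For readers unfamiliar with the architecture the abstract analyzes, a minimal sketch of a mixture-of-experts forward pass is shown below. This is not the authors' implementation; the linear experts, softmax gate, and all dimensions are illustrative assumptions. The key property is that the network's output is a convex combination of the experts' outputs, weighted by the gate:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax along the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfExperts:
    """Illustrative mixture of experts: linear experts combined by a softmax gate."""

    def __init__(self, n_experts, in_dim, out_dim):
        # Hypothetical parameters for illustration only
        self.W_e = rng.normal(size=(n_experts, in_dim, out_dim)) * 0.1  # expert weights
        self.W_g = rng.normal(size=(in_dim, n_experts)) * 0.1           # gating weights

    def forward(self, x):
        g = softmax(x @ self.W_g)                     # (batch, n_experts) gate coefficients, rows sum to 1
        y_e = np.einsum('bi,eio->beo', x, self.W_e)   # (batch, n_experts, out_dim) per-expert outputs
        return np.einsum('be,beo->bo', g, y_e)        # convex combination of expert outputs

moe = MixtureOfExperts(n_experts=3, in_dim=4, out_dim=2)
x = rng.normal(size=(5, 4))
y = moe.forward(x)
print(y.shape)  # (5, 2)
```

Because the gate outputs sum to one for every input, the combined prediction always lies in the convex hull of the individual experts' predictions, which is the structural fact such modular generalization-error analyses build on.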
Added 02 Jul 2010
Updated 02 Jul 2010
Type Conference
Year 2004
Where SSPR
Authors Luís A. Alexandre, Aurélio C. Campilho, Mohamed S. Kamel