- In this paper we investigate mixture of experts problems in the context of Local-Global Neural Networks. This type of architecture was originally conceived for functional approxim...
- In this paper we derive an upper bound for the average-case generalization error of the mixture of experts modular neural network, based on an average-case generalization...
- In this paper, we show how a topographic mapping can be created from a product of experts. We learn the parameters of the mapping using gradient descent on the negative logarithm...
- This paper presents the design and evaluation of a text categorization method based on the Hierarchical Mixture of Experts model. This model uses a divide-and-conquer principle to ...
- We investigate a form of modular neural network for classification with (a) pre-separated input vectors entering its specialist (expert) networks, (b) specialist networks which ar...
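The abstracts above all build on the same core mechanism: a gating network produces softmax mixing weights that combine the outputs of several expert networks. A minimal sketch of that forward pass, using NumPy with illustrative linear experts and made-up shapes (none of the parameter names or dimensions come from the papers listed):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy mixture of experts: K linear experts combined by a softmax gate.
K, d_in, d_out = 3, 4, 2
W_experts = rng.normal(size=(K, d_in, d_out))  # one weight matrix per expert
W_gate = rng.normal(size=(d_in, K))            # gating-network weights

def moe_forward(x):
    # x: (n, d_in) batch of inputs.
    gate = softmax(x @ W_gate)                            # (n, K) mixing weights
    expert_out = np.einsum('ni,kio->nko', x, W_experts)   # (n, K, d_out) per-expert outputs
    return np.einsum('nk,nko->no', gate, expert_out)      # gate-weighted combination

x = rng.normal(size=(5, d_in))
y = moe_forward(x)
print(y.shape)  # (5, 2)
```

In a hierarchical mixture of experts (as in the text-categorization abstract), the same gating step is applied recursively: each expert is itself a gated mixture, so the tree realizes the divide-and-conquer split mentioned there.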