This paper proposes a multilevel logic approach to output coding that uses multilevel neurons in the output layer. Training convergence for a single multilevel perceptron is considered. It is found that a multilevel neural network classifier with a reduced number of outputs often learns faster and requires fewer weights. The concepts are illustrated with a digit classifier example.
Aleksander Malinowski, Tomasz J. Cholewo, Jacek M.
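To make the output-coding idea concrete, the following is a minimal sketch (not from the paper; all names and target levels are illustrative) of how ten digit classes could be assigned multilevel target code words, so that two four-level outputs replace the usual ten binary one-hot outputs:

```python
import numpy as np

def multilevel_codes(num_classes, levels, num_outputs):
    """Assign each class a code word whose digits are taken from the
    given target activation levels (a base-len(levels) encoding).
    Returns an array of shape (num_classes, num_outputs)."""
    base = len(levels)
    assert base ** num_outputs >= num_classes, "not enough code words"
    codes = np.empty((num_classes, num_outputs))
    for c in range(num_classes):
        # Write the class index in base `base`, padded to num_outputs digits.
        digits = np.base_repr(c, base).zfill(num_outputs)
        codes[c] = [levels[int(d)] for d in digits]
    return codes

# Ten digit classes encoded with two four-level outputs
# (hypothetical target levels 0, 1/3, 2/3, 1) instead of ten one-hot outputs.
targets = multilevel_codes(10, levels=[0.0, 1/3, 2/3, 1.0], num_outputs=2)
print(targets)
```

Under this assumed scheme, the number of output neurons grows only logarithmically with the number of classes, which is the sense in which a multilevel output layer can reduce the weight count relative to one-hot coding.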