This paper presents a new pruning method for determining a nearly optimal multi-layer neural network structure. The aim of the proposed method is to reduce the size of the network by freezing any node that does not actively participate in the training process. A node is considered inactive if it has little or no effect on reducing the error of the network as training proceeds. Experimental results demonstrate a moderate to significant reduction in network size together with improved generalization performance. A notable improvement in the network’s training time is also observed.
Ali Farzan, Ali A. Ghorbani
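
The freezing criterion described in the abstract, identifying nodes whose contribution to error reduction has become negligible, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the specific activity measure (here, the mean magnitude of each node's recent weight updates), the window size, and the threshold are all assumptions introduced for the example.

```python
import numpy as np

def find_active_nodes(weight_updates, threshold=1e-3, window=5):
    """Return a boolean mask over hidden nodes: True = keep active.

    weight_updates: array of shape (epochs, n_hidden) recording the
    magnitude of each hidden node's weight change per training epoch.
    This record, the window, and the threshold are hypothetical stand-ins
    for the activity measure the paper uses; the abstract does not
    specify it.
    """
    # A node's recent activity: mean update magnitude over the last
    # `window` epochs. Nodes below `threshold` would be frozen, i.e.
    # excluded from further weight updates, shrinking the effective
    # network size.
    recent = np.abs(weight_updates[-window:]).mean(axis=0)
    return recent >= threshold
```

In use, a training loop would call this periodically and stop updating (freeze) the weights of any node whose mask entry is False, so that the remaining training time is spent only on nodes that still reduce the error.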