This paper describes a novel network model that controls its growth according to the approximation requirements of the task. Two classes of self-tuning neural models are considered, namely Growing Neural Gas (GNG) and SoftMax function networks. We combined the two models into a new one, hence the name GNG-Soft networks. The resulting model couples the effectiveness of GNG in distributing units within the input space with the approximation properties of SoftMax functions. We devised a method to estimate the approximation error incrementally; this estimate is used to tune the network's growth rate. Results showing the performance of the network in a real-world robotic experiment are reported.
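The abstract does not spell out the update rules, so the following is only a minimal sketch of the general idea under assumed details: normalised-Gaussian (SoftMax) unit activations, an exponentially decayed running error estimate, and a fixed insertion threshold (`beta`, `err_threshold`, and the `GNGSoftSketch` class are illustrative assumptions, not the authors' implementation).

```python
import numpy as np

class GNGSoftSketch:
    """Sketch of a SoftMax-activated unit network whose growth is
    gated by an incrementally estimated approximation error."""

    def __init__(self, dim_in, dim_out, sigma=0.2, beta=0.01, err_threshold=0.05):
        self.centers = np.random.randn(2, dim_in)   # start with two units
        self.weights = np.zeros((2, dim_out))       # linear output weights
        self.sigma = sigma
        self.beta = beta                            # decay rate of the running error
        self.err_threshold = err_threshold          # error level that triggers growth
        self.running_err = 0.0

    def _activations(self, x):
        # Normalised Gaussian activations: a SoftMax over squared distances.
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        a = np.exp(-d2 / (2.0 * self.sigma ** 2))
        return a / (a.sum() + 1e-12)

    def predict(self, x):
        return self._activations(x) @ self.weights

    def update(self, x, y, lr=0.1):
        a = self._activations(x)
        err_vec = y - a @ self.weights
        # Incremental (exponentially decayed) estimate of the approximation error.
        self.running_err = (1.0 - self.beta) * self.running_err \
                           + self.beta * np.linalg.norm(err_vec)
        # LMS-style update of the output weights.
        self.weights += lr * np.outer(a, err_vec)
        # Grow only while the running error indicates the current units are insufficient.
        if self.running_err > self.err_threshold:
            self.centers = np.vstack([self.centers, x])
            self.weights = np.vstack([self.weights, np.zeros_like(self.weights[0])])
            self.running_err = 0.0                  # reset the estimate after insertion
```

In this sketch the error threshold directly controls the growth rate: a higher running error inserts units more often, while a well-approximated region leaves the network unchanged.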
A. Carlevarino, R. Martinotti, Giorgio Metta, Giulio Sandini