Negative Correlation Learning (NCL) has been shown to outperform other ensemble learning approaches in off-line mode. A key factor in the success of NCL is that the learning of each ensemble member is influenced by the learning of the others, directly encouraging diversity. However, when applied to on-line learning, NCL has the problem that part of the diversity has to be built a priori, as the same sequence of training data is sent to all the ensemble members. This limits the choice of base models and may prevent the use of neural network models better suited to the problem to be solved. This paper proposes a new method to perform on-line learning based on NCL and On-line Bagging. The method directly encourages diversity, as NCL does, but sends a different sequence of training data to each base model, in the manner of on-line bagging. It therefore allows the use of deterministic base models such as Evolving Fuzzy Neural Networks (EFuNNs), which ar...
Fernanda L. Minku, Xin Yao
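
The combination outlined in the abstract can be illustrated with a minimal sketch: each ensemble member receives a Poisson(1)-weighted copy of every incoming example (on-line bagging), while its error is penalised by an NCL-style diversity term. This is only an assumed illustration, not the paper's implementation: it uses simple gradient-based linear learners in place of the EFuNNs mentioned above, and the names `OnlineBaggingNCL`, `LinearLearner`, and `lam` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)


class LinearLearner:
    """Hypothetical gradient-based base model standing in for the
    paper's EFuNNs, used only to illustrate the update scheme."""

    def __init__(self, n_features, lr=0.01):
        self.w = np.zeros(n_features)
        self.lr = lr

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y, f_ens, lam):
        # NCL-style error for member i (treating f_ens as constant):
        #   e_i = (f_i - y)^2 - lam * (f_i - f_ens)^2
        # The second term encourages diversity by pushing the member
        # away from the current ensemble output.
        f_i = self.predict(x)
        grad = 2 * (f_i - y) - 2 * lam * (f_i - f_ens)
        self.w -= self.lr * grad * x


class OnlineBaggingNCL:
    """Sketch: each member sees its own Poisson(1)-weighted number of
    copies of the incoming example (on-line bagging), and each update
    includes an NCL-style diversity penalty."""

    def __init__(self, n_members, n_features, lam=0.5):
        self.members = [LinearLearner(n_features) for _ in range(n_members)]
        self.lam = lam

    def predict(self, x):
        return float(np.mean([m.predict(x) for m in self.members]))

    def learn_one(self, x, y):
        f_ens = self.predict(x)
        for m in self.members:
            k = rng.poisson(1.0)  # on-line bagging: number of copies
            for _ in range(k):
                m.update(x, y, f_ens, self.lam)


# Toy usage on a noisy linear data stream.
ens = OnlineBaggingNCL(n_members=5, n_features=3)
true_w = np.array([1.0, -2.0, 0.5])
for _ in range(2000):
    x = rng.normal(size=3)
    y = true_w @ x + rng.normal(scale=0.1)
    ens.learn_one(x, y)
print("ensemble prediction for [1, 1, 1]:", ens.predict(np.ones(3)))
```

Because each member draws its own Poisson weight per example, the training sequences differ across members even though the base models themselves are deterministic, which is the property the abstract highlights.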