We present context-sensitive Multiple Task Learning, or csMTL, as a method of inductive transfer. It uses additional contextual inputs along with other input features when learning ...
The authors extended the idea of training multiple tasks simultaneously on a partially shared feed-forward network. A shared input subvector was added to represent common inputs...
This is an introductory book about machine learning. Note that this is a draft; it may contain typos, mistakes, etc.
The book covers the following topics: Boolean Functio...
Negative Correlation Learning (NCL) has been shown to outperform other ensemble learning approaches in off-line mode. A key point to the success of NCL is that the learning o...
Recurrent neural networks are theoretically capable of learning complex temporal sequences, but training them through gradient descent is too slow and unstable for practical use i...