ICMLA
2003

The Consolidation of Neural Network Task Knowledge

— Fundamental to the problem of lifelong machine learning is how to consolidate the knowledge of a learned task within a long-term memory structure (domain knowledge) without the loss of prior knowledge. We investigate the effect of curriculum, i.e., the order in which tasks are learned, on the consolidation of task knowledge. Relevant background material on knowledge transfer and consolidation using multiple task learning (MTL) neural networks is reviewed. A large MTL network is used as the long-term memory structure, and task rehearsal is used to overcome the stability-plasticity problem and the loss of prior knowledge. Experimental results demonstrate that curriculum has an important effect on the accuracy of consolidated knowledge, particularly for the first few tasks that are learned. The results also suggest that, for a given set of tasks and training examples, the mean accuracy of consolidated domain knowledge converges to the same level regardless of the curriculum.
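The abstract's long-term memory structure is an MTL network: a shared hidden layer that all tasks train through, plus a separate output head per task. A minimal forward-pass sketch is below; the layer sizes, activations, and use of NumPy are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Sketch of a multiple task learning (MTL) network (assumed standard
# architecture): one shared hidden layer feeding one output head per task.
# The shared layer is the common internal representation through which
# knowledge can transfer between tasks.
rng = np.random.default_rng(0)
n_inputs, n_hidden, n_tasks = 10, 8, 3

# Shared weights: updated when any task is learned or rehearsed.
W_shared = rng.normal(size=(n_inputs, n_hidden))

# Task-specific heads: one per task in the consolidated domain.
W_heads = [rng.normal(size=(n_hidden, 1)) for _ in range(n_tasks)]

def forward(x):
    """Return one prediction per task for a single input vector x."""
    h = np.tanh(x @ W_shared)  # shared representation
    return [1.0 / (1.0 + np.exp(-(h @ W_t))) for W_t in W_heads]

x = rng.normal(size=n_inputs)
outputs = forward(x)
print(len(outputs))  # one sigmoid output per task
```

Task rehearsal, as described in the abstract, would replay stored (or generated) examples of the prior tasks through their heads while a new task trains, so the shared weights serve the new task without drifting away from the old ones.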
Daniel L. Silver, Peter McCracken
Added 31 Oct 2010
Updated 31 Oct 2010
Type Conference
Year 2003
Where ICMLA
Authors Daniel L. Silver, Peter McCracken