Cognitive modeling with neural networks that start from random weights unrealistically ignores the role of existing knowledge in learning. Effective use of such knowledge could significantly speed learning. A new algorithm, knowledge-based cascade-correlation (KBCC), finds and adapts its relevant knowledge during new learning. Comparison with multi-task learning (MTL) reveals that KBCC uses its knowledge more effectively, learning faster.
Thomas R. Shultz, François Rivest
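The abstract's core idea, that KBCC "finds" relevant knowledge, rests on cascade-correlation's recruitment step: candidate units, and in KBCC also whole previously trained networks, compete to join the target network, and the winner is the candidate whose output covaries most strongly with the network's residual error. The sketch below is illustrative only, not the authors' code; the candidate names and toy data are invented to show how a relevant source network can out-compete a fresh unit under that covariance criterion.

```python
def covariance_score(outputs, errors):
    """Magnitude of the covariance between a candidate's outputs and the residual errors."""
    n = len(outputs)
    mean_o = sum(outputs) / n
    mean_e = sum(errors) / n
    return abs(sum((o - mean_o) * (e - mean_e)
                   for o, e in zip(outputs, errors)) / n)

def recruit(candidates, inputs, residual_errors):
    """Return the name of the candidate whose output best tracks the residuals."""
    scores = {name: covariance_score([f(x) for x in inputs], residual_errors)
              for name, f in candidates.items()}
    return max(scores, key=scores.get)

# Toy task: the residual error grows linearly with the input, so a candidate
# that already encodes that relationship (a stand-in for relevant prior
# knowledge) should beat a flat untrained unit and a crude step function.
inputs = [0.0, 1.0, 2.0, 3.0]
residuals = [0.0, 1.0, 2.0, 3.0]
candidates = {
    "fresh_sigmoid": lambda x: 0.5,                   # untrained unit, flat output
    "prior_network": lambda x: x,                     # relevant source knowledge
    "step_unit":     lambda x: 1.0 if x > 1.5 else 0.0,  # weakly correlated
}
print(recruit(candidates, inputs, residuals))
```

Here `prior_network` wins because its covariance with the residuals (1.25) exceeds the step unit's (0.5) and the constant unit's (0.0), mirroring the claim that relevant prior knowledge is preferentially recruited and so speeds learning.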