A novel neural network model is described that implements context-dependent learning of complex sequences. The model utilises leaky integrate-and-fire neurons to extract timing information from its input and modifies its weights using a learning rule with synaptic noise. Learning and recall phases are seamlessly integrated so that the network can gradually shift from learning to predicting its input. Experimental results using data from a real-world problem domain demonstrate that the use of context has three important benefits: (a) it prevents catastrophic interference during the learning of multiple overlapping sequences, (b) it enables the completion of sequences from missing or noisy patterns, and (c) it provides a mechanism to selectively explore the space of learned sequences during free recall.

Key words: contextual cueing, incidental learning, leaky integrate-and-fire neurons, recurrent neural network, sequence learning
Luc Berthouze, Adriaan G. Tijsseling
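The abstract's central mechanism, leaky integrate-and-fire neurons whose spike times carry the timing structure of the input, can be sketched as follows. This is a generic illustration only, not the authors' implementation: the membrane time constant, threshold, reset value, and input currents are all illustrative assumptions.

```python
def simulate_lif(currents, tau_m=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Return spike times (step indices) for a train of input currents.

    Discretised membrane dynamics: dv/dt = (-v + I) / tau_m.
    A spike is emitted when v crosses v_thresh, after which v is reset.
    The resulting spike times encode the timing of the input, which is
    the kind of information the model extracts. Parameter values here
    are illustrative assumptions, not taken from the paper.
    """
    v = 0.0
    spikes = []
    for t, i_in in enumerate(currents):
        v += dt * (-v + i_in) / tau_m   # leaky integration toward I
        if v >= v_thresh:               # threshold crossing -> spike
            spikes.append(t)
            v = v_reset
    return spikes

# A constant suprathreshold input produces regularly spaced spikes,
# whereas a subthreshold input never reaches threshold.
strong = simulate_lif([1.5] * 200)
weak = simulate_lif([0.5] * 200)
print(len(strong), len(weak))
```

With a constant input the steady-state membrane potential is simply the input current, so any input below the threshold (here 1.0) is silent, and stronger inputs fire earlier and more often — the basis for timing-sensitive coding.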