Abstract. It is known from psychology and neuroscience that multimodal integration of sensory information enhances the perception of stimuli that are corrupted in one or more modalities. A prominent example is that auditory perception of speech is enhanced when speech is bimodal, i.e., when it also has a visual modality. The function of the cortical network processing speech in the auditory and visual cortices and in multimodal association areas is modeled with a Multimodal Self-Organizing Network (MuSON), consisting of several Kohonen Self-Organizing Maps (SOMs) with both feedforward and feedback connections. Simulations with heavily corrupted phonemes and uncorrupted letters as inputs to the MuSON demonstrate strongly enhanced auditory perception. This is explained by feedback from the bimodal area into the auditory stream, as in cortical processing.
Andrew P. Paplinski, Lennart Gustafsson
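To make the described architecture concrete, the following is a minimal Python sketch of a MuSON-style arrangement of Kohonen SOMs. It is not the authors' implementation: the map sizes, the phoneme/letter feature dimensions (PHON_DIM, LETT_DIM), the use of the winning node's grid position as a map's output signal, and the present() routine are illustrative assumptions. Only the general scheme follows the abstract: an auditory map and a visual map feed a bimodal map, and the bimodal map's output is fed back into the auditory map's input on the next pass.

import numpy as np

rng = np.random.default_rng(0)

class SOM:
    """Minimal Kohonen Self-Organizing Map on a 2-D grid of nodes."""
    def __init__(self, grid_shape, input_dim):
        self.grid = np.array([(r, c) for r in range(grid_shape[0])
                              for c in range(grid_shape[1])], dtype=float)
        self.weights = rng.normal(size=(len(self.grid), input_dim))

    def winner(self, x):
        # Best-matching unit: node whose weight vector is closest to the input.
        return np.argmin(np.linalg.norm(self.weights - x, axis=1))

    def output(self, x):
        # The winner's grid coordinates serve as this map's output signal,
        # which can be passed as input to downstream maps (an assumption here).
        return self.grid[self.winner(x)]

    def train_step(self, x, lr=0.1, sigma=1.0):
        w = self.winner(x)
        # Gaussian neighbourhood around the winning node on the grid.
        d = np.linalg.norm(self.grid - self.grid[w], axis=1)
        h = np.exp(-(d ** 2) / (2 * sigma ** 2))
        self.weights += lr * h[:, None] * (x - self.weights)

# Hypothetical feature dimensions; the paper's actual phoneme and letter encodings differ.
PHON_DIM, LETT_DIM = 6, 4

aud = SOM((8, 8), PHON_DIM + 2)   # auditory map: phoneme features + feedback from bimodal map
vis = SOM((8, 8), LETT_DIM)       # visual map: letter features
bim = SOM((8, 8), 2 + 2)          # bimodal map: winner positions of auditory and visual maps

def present(phoneme, letter, feedback=np.zeros(2), train=True):
    """One pass through the sketch: feedforward to the bimodal map, feedback to auditory."""
    a_in = np.concatenate([phoneme, feedback])
    a_out = aud.output(a_in)
    v_out = vis.output(letter)
    b_in = np.concatenate([a_out, v_out])
    b_out = bim.output(b_in)
    if train:
        aud.train_step(a_in)
        vis.train_step(letter)
        bim.train_step(b_in)
    return a_out, b_out   # b_out is fed back as `feedback` on the next pass

# Toy usage: random vectors stand in for (possibly corrupted) phoneme and letter encodings.
phoneme = rng.normal(size=PHON_DIM)
letter = rng.normal(size=LETT_DIM)
a_out, b_out = present(phoneme, letter)                    # first pass, no feedback yet
a_out, b_out = present(phoneme, letter, feedback=b_out)    # second pass with bimodal feedback

In this sketch the feedback path simply appends the bimodal map's output to the auditory input vector, so a corrupted phoneme is interpreted jointly with the bimodally derived signal; this is one plausible reading of the feedback mechanism the abstract describes, not a statement of how the paper implements it.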