Multimodal integration of sensory information has clear advantages for survival: events that can be sensed in more than one modality are detected more quickly and accurately, and when the sensory information is corrupted by noise, classification of the event is more robust for multimodal percepts than for either unisensory input alone. It is shown that a Multimodal Self-Organizing Network (MuSON), consisting of several interconnected Kohonen Self-Organizing Maps (SOMs), can simulate the bimodal integration of phonemes (the auditory elements of language) and letters (the visual elements of language). The robustness of the bimodal percepts against noise in both the auditory and visual modalities is clearly demonstrated.
Lennart Gustafsson, Andrew P. Paplinski
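The building block of the MuSON described above is the Kohonen Self-Organizing Map. As a rough illustration only, the following is a minimal sketch of a single SOM trained with the classic Kohonen update rule; the map size, learning-rate schedule, neighbourhood kernel, and toy 2-D inputs are all illustrative assumptions and are not taken from the paper, which couples several such maps into a multimodal network.

```python
import math
import random

def train_som(data, rows=4, cols=4, epochs=30, lr0=0.5, sigma0=2.0, seed=0):
    """Train one Kohonen SOM on a list of equal-length input vectors.

    Hyperparameters here are illustrative assumptions, not the paper's.
    """
    rng = random.Random(seed)
    dim = len(data[0])
    # one weight vector per map node, randomly initialized in [0, 1)
    w = [[rng.random() for _ in range(dim)] for _ in range(rows * cols)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighbourhood
        for x in data:
            # best-matching unit: node whose weights are closest to x
            bmu = min(range(rows * cols),
                      key=lambda i: sum((w[i][k] - x[k]) ** 2
                                        for k in range(dim)))
            br, bc = divmod(bmu, cols)
            for i in range(rows * cols):
                r, c = divmod(i, cols)
                d2 = (r - br) ** 2 + (c - bc) ** 2    # grid distance to BMU
                h = math.exp(-d2 / (2 * sigma ** 2))  # neighbourhood kernel
                for k in range(dim):
                    # pull the node's weights toward the input
                    w[i][k] += lr * h * (x[k] - w[i][k])
    return w

def best_match(w, x):
    """Index of the map node whose weight vector is closest to x."""
    return min(range(len(w)),
               key=lambda i: sum((w[i][k] - x[k]) ** 2
                                 for k in range(len(x))))

# two well-separated toy clusters stand in for two stimulus classes
data = [[0.1, 0.1], [0.15, 0.05], [0.9, 0.9], [0.85, 0.95]]
w = train_som(data)
# after training, the two clusters should activate different map nodes
print(best_match(w, [0.1, 0.1]) != best_match(w, [0.9, 0.9]))
```

In the MuSON architecture, the outputs of such unimodal maps (one auditory, one visual) feed a further integrating map, which is what yields percepts more robust to noise than either modality alone.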