Behavioral experiments suggest that insects make use of the apparent image speed on their compound eyes to navigate through obstacles, control flight speed, land smoothly, and measure the distance they have flown. However, the vast majority of electrophysiological recordings from motion-sensitive insect neurons show responses that are tuned in spatial and temporal frequency and are thus unable to unambiguously represent image speed. We suggest that this contradiction may be resolved at an early stage of visual motion processing by nondirectional motion sensors that respond in proportion to image speed up to their peak response. We describe and characterize a computational model of these sensors and propose a mechanism by which a spatial collation of such sensors could generate speed-dependent behavior.
Charles M. Higgins
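
The paper's sensor model is not reproduced here, but the kind of tuning the abstract describes can be illustrated with a minimal sketch: the rectified (and therefore direction-blind) output of a standard Hassenstein-Reichardt correlator probed with drifting sine gratings. This is a stand-in for illustration only, not the paper's nondirectional circuit, and all names and parameter values below (the delay time constant TAU, photoreceptor SPACING, grating wavelengths) are arbitrary assumptions. The rectified response rises roughly in proportion to grating speed, peaks at a speed set by the spatial wavelength and the delay time constant, and has the same magnitude for motion in either direction; because the peak shifts with wavelength, a single unit of this kind remains spatiotemporal-frequency tuned rather than speed tuned, which is the ambiguity the abstract attributes to collation to resolve.

import numpy as np

# Illustrative parameters, not values from the paper.
DT = 1e-3        # simulation time step (s)
T_END = 4.0      # simulation duration (s)
TAU = 0.035      # correlator delay low-pass time constant (s)
SPACING = 2.0    # photoreceptor separation (deg)

def lowpass(x, tau, dt=DT):
    """First-order low-pass filter (discrete-time approximation)."""
    y = np.zeros_like(x)
    a = dt / (tau + dt)
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def response(speed, wavelength):
    """Mean rectified correlator output for a sine grating drifting at
    `speed` (deg/s) with spatial `wavelength` (deg). Taking the absolute
    value discards the sign (direction) of the correlator output,
    leaving a direction-independent magnitude."""
    t = np.arange(0.0, T_END, DT)
    p1 = np.sin(2 * np.pi * (speed * t) / wavelength)
    p2 = np.sin(2 * np.pi * (speed * t - SPACING) / wavelength)
    # Opponent Hassenstein-Reichardt correlator: delayed signal from
    # each photoreceptor multiplied with the neighbor's direct signal.
    hr = lowpass(p1, TAU) * p2 - lowpass(p2, TAU) * p1
    settle = int(1.0 / DT)   # discard the initial filter transient
    return np.mean(np.abs(hr[settle:]))

speeds = np.linspace(5, 400, 80)
for wavelength in (10.0, 20.0):
    r = np.array([response(v, wavelength) for v in speeds])
    v_peak = speeds[np.argmax(r)]
    # The response grows roughly linearly with speed below the peak; the
    # peak speed scales with the wavelength, so the unit is frequency
    # tuned, not speed tuned.
    print(f"wavelength {wavelength:4.0f} deg: peak near {v_peak:5.1f} deg/s")
    # Nondirectionality: equal magnitude for motion in either direction.
    assert np.isclose(response(100.0, wavelength),
                      response(-100.0, wavelength), rtol=1e-2)

Running the sweep shows the speed-proportional rise up to the peak for each wavelength, and the assertion checks that reversing the motion direction leaves the rectified response unchanged.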