We study the synthesis of neural coding, selective attention, and perceptual decision making. We build a hierarchical neural architecture that implements Bayesian integration of noisy sensory input and top-down attentional priors, leading to sound perceptual discrimination. Many known psychophysical and neural consequences of attentional modulation can be captured within this framework. The model offers explicit explanations of how prior information about one feature (visual location) can dramatically alter inferential performance on a completely independent feature (orientation), and of the provenance of multiplicative modulation of neural tuning curves. The model also illustrates a possible reconciliation of cortical and neuromodulatory representations of uncertainty.
Angela J. Yu, Peter Dayan
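The Bayesian integration of noisy sensory input with a top-down prior that the abstract describes can be sketched, in its simplest conjugate form, as precision-weighted fusion of a Gaussian prior with a Gaussian observation. The Gaussian assumption and the function below are illustrative only, not the paper's actual hierarchical, population-based model:

```python
def fuse_gaussian(prior_mean, prior_var, obs, obs_var):
    """Precision-weighted fusion of a Gaussian prior with a noisy
    Gaussian observation (standard conjugate update).

    Returns the posterior mean and variance. A sharper (lower-variance)
    attentional prior pulls the posterior toward the attended value and
    reduces posterior uncertainty.
    """
    prior_prec = 1.0 / prior_var       # precision = inverse variance
    obs_prec = 1.0 / obs_var
    post_prec = prior_prec + obs_prec  # precisions add under fusion
    post_mean = (prior_prec * prior_mean + obs_prec * obs) / post_prec
    return post_mean, 1.0 / post_prec

# Example: a sharp prior at 0 (var 0.5) combined with a noisy
# observation at 2 (var 2.0) yields a posterior drawn mostly to the prior.
mean, var = fuse_gaussian(prior_mean=0.0, prior_var=0.5, obs=2.0, obs_var=2.0)
# mean = 0.4, var = 0.4
```

The same precision-weighting logic underlies how a prior over one variable can reshape inference about another once the variables are coupled in a hierarchical generative model.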