We describe a uniform technique for representing both sensory data and the attentional state of an agent using a subset of modal logic with indexicals. The resulting representation maps naturally into feed-forward parallel networks, or it can be implemented on stock hardware using bit-mask instructions. The representation has "circuit semantics" (Rosenschein and Kaelbling 1986; Nilsson 1994), but can efficiently represent propositions containing modals and unary predicates and functions. We describe an example using Kluge, a vision-based mobile robot programmed to perform a fetch-and-deliver task.
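To illustrate the bit-mask idea mentioned above, here is a minimal sketch (not the paper's actual code; all proposition names are invented for illustration) in which each proposition is assigned one bit of a machine word, so a synchronous update of the whole knowledge state reduces to a few bitwise instructions, in the spirit of circuit semantics:

```python
# Hypothetical sketch: propositions as bits in a machine word, so that
# feed-forward "inference" becomes bit-mask operations on stock hardware.
# The proposition names below are illustrative assumptions, not from the paper.

SEE_OBJECT  = 1 << 0  # percept bit: the target object is visible
NEAR_OBJECT = 1 << 1  # percept bit: the object is within reach
HOLDING     = 1 << 2  # derived bit: the gripper holds the object

def update(state: int, percepts: int) -> int:
    """One synchronous tick: overwrite percept bits with fresh sensor
    data, then apply a feed-forward rule as a bitwise operation."""
    state = (state & HOLDING) | percepts  # keep derived bits, refresh percepts
    # Example rule: visible AND within reach => holding (after a grasp).
    if (state & SEE_OBJECT) and (state & NEAR_OBJECT):
        state |= HOLDING
    return state
```

Because the whole state fits in one word, such updates run in constant time per tick, which is what makes the representation cheap on conventional processors.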