We describe a new framework for efficiently computing and storing global illumination effects for complex, animated environments. The framework allows the rapid generation of sequences representing arbitrary paths in a “view space” within an environment in which both the viewer and objects move. The global illumination is stored as time sequences of range images at base locations that span the view space. We present algorithms for determining the locations of these base images and the time steps required to adequately capture the effects of object motion. We also present algorithms for computing the global illumination in the base images that exploit spatial and temporal coherence by considering direct and indirect illumination separately. We discuss an initial implementation using the new framework. Results from our implementation demonstrate the efficient generation of multiple tours through a complex space, and a tour of an environment in which objects move.
Jeffry Nimeroff, Julie Dorsey, Holly E. Rushmeier