We present a hardware-accelerated method for video-based rendering that relies on an approximate model of scene geometry. Our goal is to render high-quality views of the scene from arbitrary viewpoints in real time, using as input synchronized video streams from only a small number of calibrated cameras distributed around the scene. Although only a very coarse geometry reconstruction is possible, our rendering approach based on textured billboards preserves the details present in the source images. By exploiting the multi-texturing and blending facilities of modern graphics cards, we achieve real-time frame rates on current off-the-shelf hardware. One possible application of our algorithm is an inexpensive system for displaying 3D videos, which the user can watch from an interactively chosen viewpoint, or ultimately even live 3D television.
Bastian Goldlücke, Marcus A. Magnor