Conventional human-computer interfaces for the exploration of volume datasets employ the mouse as an input device. Specifying an oblique orientation for a cross-sectional plane through the dataset using such interfaces requires an indirect approach involving a combination of actions that must be learned by the user. In this paper we propose a new interface model that aims to provide an intuitive means of orienting and translating a cross-sectional plane through a volume dataset. Our model uses a hand-held rectangular panel that is manipulated by the user in free space, resulting in corresponding manipulations of the cross-sectional plane through the dataset. A basic implementation of the proposed model was evaluated relative to a conventional mouse-based interface in a controlled experiment in which users were asked to find a series of targets within a specially designed volume dataset. The results of the experiment indicate that users experienced significantly less workload and better ...
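The panel-to-plane mapping described above can be sketched as follows. This is a minimal illustration under assumed conventions (the panel's tracked pose given as a position and a 3×3 rotation matrix, with the panel's local +z axis taken as the plane normal); the names and representation are hypothetical, not the paper's implementation:

```python
# Hypothetical sketch: deriving the cross-sectional plane from the
# hand-held panel's tracked pose. Pose format and names are assumed.
from dataclasses import dataclass

@dataclass
class Plane:
    normal: tuple   # unit normal (a, b, c)
    offset: float   # d in the plane equation a*x + b*y + c*z = d

def plane_from_panel(position, rotation):
    """Map the panel's pose to a slicing plane in dataset coordinates.

    position: (x, y, z) of the panel centre.
    rotation: 3x3 rotation matrix (list of rows) for the panel's
              orientation; its third column is the rotated local +z
              axis, used here as the plane normal.
    """
    n = (rotation[0][2], rotation[1][2], rotation[2][2])
    # Offset is the signed distance term: d = n . position.
    d = sum(ni * pi for ni, pi in zip(n, position))
    return Plane(n, d)

# Panel held flat (identity orientation) at height 5 yields the
# axis-aligned slice z = 5.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
p = plane_from_panel((0.0, 0.0, 5.0), identity)
```

As the user tilts or translates the panel, re-evaluating this mapping each frame would make the cross-sectional plane follow the panel directly, which is the correspondence the model relies on.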