Images and videos can be indexed by multiple features at different levels, such as color, texture, motion, and text annotation. Organizing this information into a system that users can query effectively is a challenging and important problem. In this paper, we present VISMap, a visual information seeking system that extends the traditional query paradigms of query-by-example and query-by-sketch and replaces relevance-feedback models with principles from information visualization and concept representation. Users no longer perform lengthy "one-shot" queries or rely on hidden relevance feedback mechanisms. Instead, we provide a rich set of tools that allows users to construct personal views of the video database, directly visualize and manipulate those views, and comprehend the effects of individual query criteria on the final search results. The set of tools includes: 1) a feature space browser for feature-based exploration and navigation, 2) a distance map for metric...