This paper presents a multi-view tracker designed to operate in smart rooms equipped with multiple cameras, which are assumed to be calibrated. In particular, we demonstrate a virtual classroom application in which the system automatically selects the camera with the 'best' view of the face of a person moving in the room. The real-time object tracking needed to achieve this is implemented by means of color-based particle filtering. The use of multiple model histograms for the target (a human head) results in robust tracking, even when the view of the target changes considerably, e.g., from the front to the back. Information is shared between the cameras, which adds robustness to the system: once a camera has lost the target, it can be reinitialized with the help of the epipolar constraints suggested by the others. Experiments in our research environment corroborate the effectiveness of the approach.
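To illustrate the tracking idea summarized above, the following is a minimal sketch (not the paper's implementation) of one step of a color-based particle filter that scores each particle against several target model histograms and keeps the best match. The helper `frame_hist_fn`, which returns the color histogram of the image region around a candidate position, is assumed to be supplied by the caller.

```python
import numpy as np

def bhattacharyya(p, q):
    # Bhattacharyya coefficient between two normalized color histograms
    return np.sum(np.sqrt(p * q))

def particle_filter_step(particles, weights, frame_hist_fn, model_hists,
                         sigma_pos=5.0, sigma_obs=0.1):
    """One resample/propagate/weight cycle of a color-based particle filter.

    particles:    (N, 2) array of candidate target positions in the image
    weights:      (N,) normalized weights from the previous step
    frame_hist_fn: hypothetical helper returning the color histogram of the
                   region around a given position in the current frame
    model_hists:  reference histograms of the target taken from several
                  viewpoints (e.g., front, side, back)
    """
    n = len(particles)
    # Resample particles according to the previous weights
    idx = np.random.choice(n, size=n, p=weights)
    particles = particles[idx]
    # Propagate with a simple random-walk motion model
    particles = particles + np.random.normal(0.0, sigma_pos, particles.shape)
    # Weight each particle by its best match over all model histograms,
    # so the tracker stays locked on even when the visible side changes
    new_weights = np.empty(n)
    for i, pos in enumerate(particles):
        cand = frame_hist_fn(pos)
        d = min(np.sqrt(max(1.0 - bhattacharyya(cand, m), 0.0))
                for m in model_hists)
        new_weights[i] = np.exp(-d ** 2 / (2.0 * sigma_obs ** 2))
    new_weights /= new_weights.sum()
    # The state estimate is the weighted mean of the particle positions
    estimate = np.average(particles, axis=0, weights=new_weights)
    return particles, new_weights, estimate
```

Using the minimum distance over the set of model histograms is one simple way to realize the multi-histogram idea mentioned in the abstract; other combination rules (e.g., averaging the similarities) would also fit the same framework.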