We propose an information fusion approach to tracking objects from different viewpoints that can detect and recover from tracking failures. We introduce a reliability measure that combines terms derived from correlation-based template matching and the epipolar geometry of the cameras. This measure evaluates the performance of the 2D trackers in each camera view and detects tracking failures. The 3D object trajectory is constructed using stereoscopy and used to predict the next 3D position of the object. In case of track loss in one camera view, the projection of the predicted 3D position onto the image plane of that view is used to reinitialize the lost 2D tracker. We conducted experiments with 34 subjects to evaluate the proposed system on videos of facial feature movements during human-computer interaction. The system successfully detected feature loss and gave promising results on accurate reinitialization of the lost features.
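To make the recovery step concrete, the sketch below is a minimal illustration (ours, not the paper's implementation): it assumes a constant-velocity motion model for predicting the next 3D position and a standard pinhole projection matrix P = K[R|t] for each camera; all function and variable names are hypothetical.

```python
import numpy as np

def predict_next_3d(track_3d):
    """Predict the next 3D position with a constant-velocity model
    (an assumption; the abstract only states the 3D trajectory is
    used for prediction)."""
    p_prev, p_curr = track_3d[-2], track_3d[-1]
    return p_curr + (p_curr - p_prev)

def project(P, X):
    """Project a 3D point X onto the image plane of a camera with
    3x4 projection matrix P = K [R | t]."""
    Xh = np.append(X, 1.0)       # homogeneous coordinates
    x = P @ Xh
    return x[:2] / x[2]          # perspective division

def reinitialize_lost_tracker(track_3d, P_lost_view):
    """On track loss in one view, reinitialize the 2D tracker at the
    projection of the predicted 3D position into that view."""
    X_pred = predict_next_3d(track_3d)
    return project(P_lost_view, X_pred)

# Hypothetical usage: a short 3D trajectory and one camera's matrix.
track = [np.array([0.0, 0.0, 2.0]), np.array([0.05, 0.0, 2.0])]
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
P = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # camera at the origin
print(reinitialize_lost_tracker(track, P))        # new 2D tracker position
```

In the full system, this reprojected point would seed the correlation-based template tracker in the failed view, after which the reliability measure resumes monitoring it.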