In this paper we present an approach to simultaneously estimating the gaze directions of multiple people within the view of a panoramic camera. Human faces are located and tracked using a probabilistic skin-color model combined with motion detection, and neural networks estimate the head pose of each detected face. With this approach, the locations of multiple people around a meeting table can be tracked and their gaze directions estimated using only a single panoramic camera. We achieve an accuracy of 9 degrees for head pan estimation and 6 degrees for tilt estimation in a multi-user setting.
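To make the face-detection step concrete, the sketch below illustrates one common form of probabilistic skin-color model: a histogram-based Bayes classifier that estimates P(skin | color) from labeled skin and non-skin pixels. This is a minimal, hypothetical illustration of the general technique, not the authors' implementation; the class name, bin count, and training data are assumptions.

```python
# Hypothetical sketch of a probabilistic skin-color classifier
# (illustrative only; not the paper's actual model).
from collections import defaultdict

class SkinColorModel:
    def __init__(self, bins=32):
        self.bins = bins
        self.skin = defaultdict(int)     # color histogram of skin pixels
        self.nonskin = defaultdict(int)  # color histogram of non-skin pixels
        self.n_skin = 0
        self.n_nonskin = 0

    def _bin(self, rgb):
        # Quantize an (r, g, b) tuple into a coarse histogram bin.
        step = 256 // self.bins
        return tuple(c // step for c in rgb)

    def train(self, pixels, is_skin):
        # Accumulate histogram counts from labeled pixels.
        hist = self.skin if is_skin else self.nonskin
        for rgb in pixels:
            hist[self._bin(rgb)] += 1
        if is_skin:
            self.n_skin += len(pixels)
        else:
            self.n_nonskin += len(pixels)

    def p_skin(self, rgb, prior=0.5):
        # P(skin | color) via Bayes' rule, with Laplace smoothing
        # so unseen colors get a small nonzero likelihood.
        b = self._bin(rgb)
        p_c_skin = (self.skin[b] + 1) / (self.n_skin + self.bins ** 3)
        p_c_non = (self.nonskin[b] + 1) / (self.n_nonskin + self.bins ** 3)
        num = p_c_skin * prior
        return num / (num + p_c_non * (1 - prior))

model = SkinColorModel()
model.train([(220, 180, 150)] * 100, is_skin=True)   # example skin tones
model.train([(40, 60, 200)] * 100, is_skin=False)    # example background
print(model.p_skin((220, 180, 150)) > 0.9)  # → True
```

In a full system such a classifier would score every pixel of the panoramic frame, with skin-colored regions (typically combined with motion cues, as in the abstract) grouped into candidate face regions that are then passed to the head-pose network.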