We present a system capable of visually detecting pointing gestures and estimating the 3D pointing direction in real time. We use Hidden Markov Models (HMMs) trained on different phases of sample pointing gestures to detect the occurrence of a gesture. For estimating the pointing direction, we compare two approaches: 1) the line of sight between head and hand, and 2) the forearm orientation. Input features for the HMMs are the 3D trajectories of the person's head and hands, extracted from image sequences provided by a stereo camera. In a person-independent test scenario, our system achieved a gesture detection rate of 88%. For 90% of the detected gestures, the correct pointing target (one out of eight objects) was identified.
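The first pointing-direction estimate, the line of sight between head and hand, can be sketched as a nearest-target search along the head-hand ray. The following is a minimal illustration, not the paper's implementation: the function name and the geometric tie-breaking details are our own, and it assumes head, hand, and target positions are already available as 3D points in a common camera frame.

```python
import numpy as np

def estimate_pointing_target(head, hand, targets):
    """Pick the target closest to the head-hand line of sight.

    The pointing ray starts at the head and passes through the hand
    (one of the two direction estimates compared in the abstract; the
    other uses forearm orientation). Returns the index of the target
    with the smallest perpendicular distance to the ray, or -1 if no
    target lies in front of the person.
    """
    head = np.asarray(head, dtype=float)
    direction = np.asarray(hand, dtype=float) - head
    direction /= np.linalg.norm(direction)

    best_idx, best_dist = -1, np.inf
    for i, target in enumerate(targets):
        v = np.asarray(target, dtype=float) - head
        s = v @ direction                 # projection of target onto the ray
        if s < 0:                         # target behind the person; skip it
            continue
        dist = np.linalg.norm(v - s * direction)  # perpendicular distance to ray
        if dist < best_dist:
            best_idx, best_dist = i, dist
    return best_idx
```

In the paper's test scenario the candidate set would contain the eight object positions; the analogous computation for the forearm approach simply replaces the head-hand ray with a ray along the elbow-hand direction.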