We develop a wearable vision system consisting of a visual direction (gaze) sensor and a pair of stereo cameras. First, we establish a method for calibrating the system so that it can detect the user's gaze points even in real situations where the depth of the gaze point varies. Next, we propose a method for detecting the user's gazing region in the form of a planar convex polygon: the system first identifies the user's fixation point, and then applies a stereo algorithm and robust statistics to extract the gazing region. As a result, the system can detect the gazing region and report its 3D position to the user.
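The final step described above, applying robust statistics to stereo-reconstructed 3D points to recover a planar convex polygon, can be sketched as follows. This is an illustrative example only, not the paper's actual implementation: it assumes RANSAC as the robust estimator, substitutes a synthetic point cloud for real stereo output, and all function names are hypothetical.

```python
import random, math

def plane_from_points(p1, p2, p3):
    # Plane through three points: unit normal n and offset d with n.p + d = 0.
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    norm = math.sqrt(sum(c*c for c in n))
    if norm < 1e-9:          # degenerate (collinear) sample
        return None
    n = [c / norm for c in n]
    return n, -sum(n[i]*p1[i] for i in range(3))

def ransac_plane(points, iters=200, thresh=0.02, seed=0):
    # Robust plane fit: repeatedly fit a plane to 3 random points and
    # keep the model with the most inliers (points near the plane).
    rng = random.Random(seed)
    best_inliers, best_model = [], None
    for _ in range(iters):
        model = plane_from_points(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(n[i]*p[i] for i in range(3)) + d) < thresh]
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, model
    return best_inliers, best_model

def convex_hull_2d(pts):
    # Andrew's monotone-chain convex hull of 2D points.
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Synthetic stand-in for stereo reconstruction: points near the plane
# z = 1.0 (the gazed surface), plus depth outliers from mismatches.
rng = random.Random(42)
surface  = [(rng.uniform(-1, 1), rng.uniform(-1, 1), 1.0 + rng.gauss(0, 0.005))
            for _ in range(80)]
outliers = [(rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(1.5, 3.0))
            for _ in range(20)]
points = surface + outliers

inliers, model = ransac_plane(points)
# The gazing region: convex hull of the inliers projected onto (x, y).
polygon = convex_hull_2d([(round(p[0], 4), round(p[1], 4)) for p in inliers])
print(len(inliers), len(polygon))
```

RANSAC rejects the depth outliers that a stereo matcher inevitably produces, so the resulting polygon bounds only the points that actually lie on the gazed surface.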