In this paper, we propose a personal positioning method for a wearable Augmented Reality (AR) system that allows a user to move freely both indoors and outdoors. The user is equipped with self-contained sensors, a wearable camera, an inertial head tracker, and a display. The method fuses, within a Kalman filtering framework, estimates of relative displacement caused by human walking locomotion with estimates of absolute position and orientation. The former are derived from intensive analysis of human walking behavior using the self-contained sensors; the latter are obtained by matching video frames from the wearable camera against an image database prepared beforehand.

Keywords: personal positioning, pedometer, human walking analysis, sensor fusion
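The fusion idea described above can be illustrated with a minimal one-dimensional Kalman filter sketch: relative displacement estimates (e.g. from step detection) drive the prediction step, while occasional absolute position fixes (e.g. from image matching) drive the update step. All noise parameters, step lengths, and function names below are illustrative assumptions, not values from the paper.

```python
def predict(x, p, dx, q):
    """Propagate the position estimate by a relative displacement dx.

    Process noise q models the growing uncertainty of dead reckoning.
    (Illustrative sketch; not the paper's actual filter formulation.)
    """
    return x + dx, p + q

def update(x, p, z, r):
    """Correct the estimate with an absolute position measurement z.

    r is the measurement noise of the absolute fix (e.g. image matching).
    """
    k = p / (p + r)                  # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

# Dead reckoning: 10 detected steps of an assumed 0.65 m each,
# while the true distance walked is 7.0 m, so drift accumulates.
x, p = 0.0, 1.0
for _ in range(10):
    x, p = predict(x, p, dx=0.65, q=0.05)

# One absolute fix (e.g. from matching a video frame against the
# image database) pulls the drifted estimate back toward the truth.
x, p = update(x, p, z=7.0, r=0.1)
print(round(x, 2))  # → 6.97
```

The same structure extends to the full 2D/3D position-and-orientation state the abstract describes; only the state vector and the matrices change, not the predict/update pattern.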