Abstract. This paper proposes a new vision-based system that extracts walking parameters from a human demonstration. The system uses only an uncalibrated USB webcam connected to a standard PC, and the demonstrator only needs to attach three colored patches to one leg and walk in a plane roughly perpendicular to the camera's viewing direction. The walking parameters are then extracted in real time, using a local tracking system to follow the markers and a fast decision layer to detect the main features of the leg movement. Since only one leg can be tracked reliably with a single camera, we assume symmetric movement of the left and right legs. The extracted parameters have been successfully validated by generating walking sequences for both simulated and real Robo-Erectus humanoid robots.
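To make the marker-tracking step concrete, the sketch below shows one way the color-patch detection could be implemented with OpenCV in Python. The paper does not specify its segmentation method or color assignments; the HSV thresholds, the hip/knee/ankle labels, the camera index, and the full-frame search (the paper's local tracker instead restricts the search to a window around each marker's last known position) are all illustrative assumptions, not the authors' implementation.

    import cv2
    import numpy as np

    # Hypothetical HSV ranges for three colored leg patches (hip, knee,
    # ankle); the actual colors and thresholds used in the paper are not
    # specified, so these values are illustrative only.
    MARKER_RANGES = {
        "hip":   ((0, 120, 120),   (10, 255, 255)),   # assumed red patch
        "knee":  ((50, 120, 120),  (70, 255, 255)),   # assumed green patch
        "ankle": ((110, 120, 120), (130, 255, 255)),  # assumed blue patch
    }

    def marker_centroids(frame_bgr):
        """Return the image-plane centroid of each patch, or None if lost."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        centroids = {}
        for name, (lo, hi) in MARKER_RANGES.items():
            mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
            m = cv2.moments(mask)
            # m00 is the mask area; zero means the patch was not detected.
            centroids[name] = (
                (m["m10"] / m["m00"], m["m01"] / m["m00"])
                if m["m00"] > 0 else None
            )
        return centroids

    cap = cv2.VideoCapture(0)  # uncalibrated USB webcam, device index assumed
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        pts = marker_centroids(frame)
        # From the hip/knee/ankle centroids, per-frame joint angles of the
        # tracked leg can be derived; a decision layer (not reproduced here)
        # would then extract gait features such as step length and cycle time.
        print(pts)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

One centroid per patch per frame is sufficient to recover the joint angles of the tracked leg, from which gait features can be estimated and, under the symmetry assumption, mirrored to the untracked leg.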