We explore the use of a shoe-mounted camera as a sensory system for wearable computing, demonstrating tools for gait analysis, obstacle detection, and context recognition. Using only visual information, we detect periods of stability and motion during walking. During the stable phase, the foot can be assumed to lie parallel to the ground plane; in this condition the floor dominates the lower part of the camera’s view, and we show that it can be segmented from the remainder of the scene, leaving walls and obstacles. We also demonstrate floor surface recognition for context awareness.
Paul M. Fitzpatrick, Charles C. Kemp
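The stability detection summarized in the abstract could be sketched with a simple frame-differencing heuristic: when the foot is planted, consecutive frames from the shoe camera change very little. This is a minimal illustration, not the authors' actual method; the function name and threshold are hypothetical.

```python
import numpy as np

def stable_phases(frames, threshold=2.0):
    """Label each frame-to-frame transition as stable (True) or
    moving (False) using mean absolute intensity difference.
    `threshold` is a hypothetical tuning parameter in gray levels."""
    labels = []
    for prev, curr in zip(frames, frames[1:]):
        diff = np.mean(np.abs(curr.astype(np.float64) - prev.astype(np.float64)))
        labels.append(bool(diff < threshold))
    return labels

# Synthetic example: two identical frames (foot planted), then a
# brighter frame (foot in motion).
still = np.zeros((8, 8), dtype=np.uint8)
bright = np.full((8, 8), 50, dtype=np.uint8)
print(stable_phases([still, still, bright]))  # [True, False]
```

In a real system the threshold would be tuned to the camera's noise level, and the stable intervals would gate the floor-segmentation step described above.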