The lure of using motion vision as a fundamental element in the perception of space drives this effort to use flow features as the sole cues for robot mobility. Real-time estimates of image flow and flow divergence provide the robot’s sense of space. The robot steers down the center of a conceptual corridor by comparing the left and right peripheral flows. Large central flow divergence warns the robot of impending collisions at “dead ends.” When this occurs, the robot turns around and resumes wandering. Behavior is generated by directly using flow-based information in the 2-D image sequence; no 3-D reconstruction is attempted. Active mechanical gaze stabilization simplifies the visual interpretation problem by reducing camera rotation. By combining corridor following and dead-end deflection, the robot has wandered around the lab at 30 cm/s for as long as 20 minutes without collision. The ability to support this behavior in real time with current equipment promises expanded capabilities as ...
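To make the control strategy concrete, the following is a minimal sketch of the two behaviors described above: corridor following by balancing left and right peripheral flow magnitudes, and dead-end deflection triggered by large central flow divergence. The function names, gains, and threshold values are illustrative assumptions, not the authors' implementation; only the 30 cm/s forward speed is taken from the reported experiments.

```python
import numpy as np

# Hypothetical tuning constants (not from the paper).
STEER_GAIN = 0.5             # turn rate (rad/s) per unit of left/right flow imbalance
DIVERGENCE_THRESHOLD = 0.8   # central flow divergence treated as a "dead end"
FORWARD_SPEED = 0.30         # m/s, matching the reported 30 cm/s wandering speed


def steering_command(left_flow: np.ndarray, right_flow: np.ndarray) -> float:
    """Corridor following: steer away from the side with larger peripheral flow.

    Larger flow magnitude indicates nearer surfaces on that side, so turning
    toward the side with smaller flow keeps the robot centered in the
    conceptual corridor.
    """
    imbalance = np.mean(np.abs(left_flow)) - np.mean(np.abs(right_flow))
    return -STEER_GAIN * imbalance  # sign convention assumed: positive = turn left


def dead_end_detected(central_divergence: float) -> bool:
    """Large central flow divergence signals an imminent frontal collision."""
    return central_divergence > DIVERGENCE_THRESHOLD


def wander_step(left_flow, right_flow, central_divergence):
    """One control cycle: returns (forward_speed, turn_rate, turn_around_flag)."""
    if dead_end_detected(central_divergence):
        # Dead-end deflection: stop, turn around, then resume wandering.
        return 0.0, 0.0, True
    return FORWARD_SPEED, steering_command(left_flow, right_flow), False
```

In this sketch the flow fields are assumed to arrive already separated into left-peripheral, right-peripheral, and central regions of the (gaze-stabilized) image; the balance rule and the divergence threshold together reproduce the wander-until-dead-end behavior without any 3-D reconstruction.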