We present a robust strategy for docking a mobile robot in close proximity to an upright surface using optical flow field divergence. Unlike previous approaches, we achieve this without explicit segmentation of the surface in the image, and with complete optical flow estimation (i.e. no affine models) in the control loop. A simple proportional control law regulates the vehicle's velocity, using only the raw, unfiltered flow divergence as input. Central to the robustness of our approach is the derivation of a time-to-contact estimator that accounts for small rotations of the robot during ego-motion. We present both analytical and experimental results showing that, by tracking the focus of expansion on the looming surface, we can compensate for such rotations, thereby significantly improving the robustness of the time-to-contact estimate. This is demonstrated on an off-board natural image sequence and in closed-loop control of a mobile robot.
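For intuition, the two quantities the abstract relies on can be sketched as follows. The time-to-contact relation `tau = 2/D` holds for pure translation toward a frontoparallel plane; the reference divergence and gain in the proportional law below are illustrative placeholders, not values from the paper:

```python
def ttc_from_divergence(div):
    """Time-to-contact from flow-field divergence.

    For pure translation toward a frontoparallel plane, the image flow is
    (u, v) = (x/tau, y/tau), so its divergence is D = 2/tau, giving
    tau = 2/D.
    """
    return 2.0 / div

def velocity_command(div, div_ref=0.5, k_p=0.2):
    """Proportional docking law on raw, unfiltered divergence.

    div_ref and k_p are hypothetical tuning values. Regulating divergence
    to a constant reference makes the approach velocity decay smoothly as
    the surface looms: the command is positive (speed up) when observed
    divergence is below the reference and negative (slow down) above it.
    """
    return -k_p * (div - div_ref)
```

Note that no rotation compensation appears in this sketch; the paper's contribution is precisely to correct the divergence-based estimate for small ego-rotations by tracking the focus of expansion.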