We present a strategy for generating real-time relative depth maps of an environment from optical flow, under general motion. We achieve this using an insect-inspired hemispherical fish-eye sensor with a 190-degree field of view and a de-rotated optical flow field. The de-rotation algorithm applied is based on the theoretical work of Nelson and Aloimonos [12], who outline a method for obtaining all rotational components of motion on a sphere about any great circle. From this we may obtain the translational component of motion and construct full relative depth maps on the sphere. We demonstrate the robustness of this strategy in both simulation and real-world experiments. To our knowledge, this is the first demonstrated implementation of the Nelson and Aloimonos algorithm working in real time over real image sequences. These preliminary results provide a compelling argument for the global interpretation of optical flow under spherical projection when inferring scene structure. They als...
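To make the depth-recovery step concrete, the following is a minimal illustrative sketch (not the paper's implementation) based on the standard spherical motion-field equation: for pure translation t, the flow at viewing direction r is flow(r) = -(1/R)(t - (t·r)r), so the inverse depth 1/R follows directly from the magnitude of the de-rotated flow. The synthetic directions, depths, and translation vector below are assumptions for the demonstration; in practice only the direction of t is known, so depth is recovered up to a global scale.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic unit viewing directions r on the sphere and true depths R(r)
# (illustrative values only).
n = 500
r = rng.normal(size=(n, 3))
r /= np.linalg.norm(r, axis=1, keepdims=True)
depth = rng.uniform(1.0, 10.0, size=n)

# Assumed camera translation t; after de-rotation, only the translational
# component of the motion field remains.
t = np.array([0.3, -0.1, 1.0])

# Spherical motion field for pure translation:
#   flow(r) = -(1/R) * (t - (t . r) r)
t_perp = t - (r @ t)[:, None] * r
flow = -t_perp / depth[:, None]

# Recover relative (inverse) depth from the de-rotated flow magnitude:
#   1/R = |flow(r)| / |t - (t . r) r|
inv_depth_est = np.linalg.norm(flow, axis=1) / np.linalg.norm(t_perp, axis=1)

print(np.allclose(inv_depth_est, 1.0 / depth))  # True
```

Note that the estimate degrades near the focus of expansion (where |t - (t·r)r| approaches zero), which is one reason a wide, near-hemispherical field of view helps: most viewing directions lie well away from the translation axis.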