We present a view-based method for steering a robot within a network of positions; this includes navigation along a prerecorded path, but also allows arbitrary movement of the robot between adjacent positions in the network. The approach uses an upward-looking omnidirectional camera; even an optical system of very modest quality is sufficient, since all views are represented in terms of low-order basis functions (spherical harmonics). Motor control signals for the robot are derived from a differential matching approach; the computation of the required gradient information is greatly simplified by exploiting the fact that all images are represented in terms of basis functions. The viability of the approach for steering the robot has been demonstrated in extensive simulations using photorealistic views; the validity of these simulations with respect to a physical system implementation operating in a real indoor scene has been established in previous investigations [5].
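To make the two key ingredients concrete, the sketch below (not the authors' code; all function names, the truncation order L_MAX, and the restriction to a pure rotation about the vertical axis are illustrative assumptions) shows how a view can be projected onto low-order spherical harmonics and how a differential-matching gradient becomes a closed-form expression in the coefficients, because a rotation about the vertical axis only multiplies each coefficient c_{l,m} by exp(-i m alpha).

```python
# Illustrative sketch only: project an upward-looking omnidirectional view
# onto low-order spherical harmonics and derive a steering gradient by
# differential matching of the coefficient vectors. Names and parameters
# are assumptions, not the paper's implementation.
import numpy as np
from scipy.special import sph_harm

L_MAX = 3  # low-order basis; assumed truncation degree


def sh_coefficients(intensity, theta, phi):
    """Least-squares projection of image samples onto complex spherical
    harmonics Y_l^m up to degree L_MAX.

    intensity : (N,) grey values sampled along viewing directions
    theta     : (N,) azimuth angles in [0, 2*pi)
    phi       : (N,) polar angles in [0, pi]
    """
    basis = np.column_stack([
        sph_harm(m, l, theta, phi)          # scipy convention: (m, l, azimuth, polar)
        for l in range(L_MAX + 1)
        for m in range(-l, l + 1)
    ])
    coeffs, *_ = np.linalg.lstsq(basis, intensity.astype(complex), rcond=None)
    return coeffs


def heading_gradient(c_current, c_target):
    """Derivative of the squared coefficient distance with respect to a
    rotation alpha about the vertical axis, evaluated at alpha = 0.

    Since the rotated coefficients are c_{l,m} * exp(-i*m*alpha), the
    gradient reduces to a closed-form sum over the coefficients -- no
    image-domain differentiation is needed.
    """
    m_orders = np.array([m for l in range(L_MAX + 1) for m in range(-l, l + 1)])
    diff = c_current - c_target
    # d/dalpha sum_k |c_k * exp(-i*m_k*alpha) - t_k|^2 at alpha = 0
    return 2.0 * np.sum(np.real(np.conj(diff) * (-1j * m_orders * c_current)))
```

A controller could use the sign and magnitude of such a gradient as a turn command that drives the current coefficient vector toward that of the target view; the full method also has to handle translations between adjacent positions in the network, which this rotation-only sketch does not cover.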