In this paper, we analyze and attempt to predict the gaze behavior of users navigating virtual environments. We focus on first-person navigation, which involves forward and backward motion on a ground surface with turns to the left or right. We found that gaze behavior in virtual reality, with input devices such as mice and keyboards, is similar to that observed in real life: participants anticipated turns as they do in real-life conditions, i.e., when they can actually move their body and head. We also found influences of visual occlusions and optic flow similar to those reported in the existing literature on real-world navigation. We then propose three simple gaze prediction models taking as input (1) the motion of the user, given by the rotational velocity of the camera about the yaw axis (considered here as the virtual heading direction), and/or (2) the optic flow on screen. These models were tested with data collected in various virtual environments. Results show th...
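To make the first kind of model concrete, the sketch below shows one minimal way a yaw-velocity-based predictor could look. This is an illustrative assumption, not the paper's implementation: the function `predict_gaze_x`, the anticipation gain `k`, the sign convention, and the screen normalization are all hypothetical choices standing in for the unspecified model details.

```python
# A minimal sketch (assumed, not the paper's model): predict the horizontal
# gaze offset on screen from the camera's yaw rotational velocity, so that
# gaze "leads into" the turn as observed during real-world locomotion.

import numpy as np

def predict_gaze_x(yaw_velocity, k=0.35, screen_half_width=1.0):
    """Predict a normalized horizontal gaze offset from screen center.

    yaw_velocity      : camera yaw speed in rad/s (hypothetical convention:
                        positive = turning left)
    k                 : hypothetical anticipation gain mapping yaw speed
                        to a screen-space gaze offset
    screen_half_width : half-width of the normalized screen, so the output
                        stays within the visible display
    """
    gaze_x = k * yaw_velocity  # gaze shifts toward the direction of the turn
    return float(np.clip(gaze_x, -screen_half_width, screen_half_width))

# Example: a steady right turn at 0.8 rad/s shifts the predicted gaze
# to the right of the screen center under this sign convention.
print(predict_gaze_x(-0.8))  # about -0.28
```

An optic-flow-based variant, per the abstract's description, would replace or combine this input with a screen-space flow field, e.g. by weighting candidate gaze positions by local flow magnitude; the combination scheme is left open here since the abstract does not specify it.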