Real scenes are full of specularities (highlights and reflections), and yet most vision algorithms ignore them. To capture the appearance of realistic scenes, we need to model specularities as separate layers. In this paper, we study the behavior of specularities in static scenes as the camera moves, and describe their dependence on varying surface geometry, orientation, and scene point and camera locations. For rectilinear camera motion with constant velocity, we study how the specular motion deviates from a straight trajectory (disparity deviation) and how much it violates the epipolar constraint (epipolar deviation). Surprisingly, for surfaces that are convex or not highly undulating, these deviations are usually quite small. We also study the appearance of specularities, i.e., how they interact with the body reflection and with the usual occlusion ordering constraints applicable to diffuse opaque layers. We present a taxonomy of specularities based on their photometric...
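As a rough guide to how these two measures might be quantified (a minimal sketch under standard two-view assumptions; the fundamental matrix $F$, the homogeneous image points $\mathbf{x}, \mathbf{x}'$, the tracked positions $\mathbf{x}_t$, and the line parameters $\mathbf{a}, \mathbf{b}$ are illustrative notation, not necessarily the paper's exact formulation): the epipolar deviation of a specular feature matched between two frames can be taken as its distance from the induced epipolar line, and the disparity deviation as the residual of its track from the straight image trajectory that a static diffuse point would follow under constant-velocity rectilinear motion.

\[
  d_{\mathrm{epi}}(\mathbf{x}, \mathbf{x}') =
    \frac{\lvert \mathbf{x}'^{\top} F \mathbf{x} \rvert}
         {\sqrt{(F\mathbf{x})_1^{2} + (F\mathbf{x})_2^{2}}},
  \qquad
  d_{\mathrm{disp}} =
    \min_{\mathbf{a},\,\mathbf{b}} \; \max_{t}\,
    \bigl\lVert \mathbf{x}_t - (\mathbf{a} + \mathbf{b}\, t) \bigr\rVert .
\]

Here $\mathbf{x}'^{\top} F \mathbf{x} = 0$ is the usual epipolar constraint satisfied by a static diffuse point, so both deviations vanish for an ideal diffuse feature and grow as the virtual specular feature departs from the geometry of a fixed 3D point.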