Abstract. Research on temporal textures has focused mainly on modeling, synthesis, and detection, rather than on finding changes between different temporal textures. Shot change detection, based on appearance, has received much research attention, but the detection of changes between temporal textures has not been sufficiently addressed. Successive temporal textures in a video often have a similar appearance but different motion, a change that shot change detection cannot discern. In this paper, changes between temporal textures are captured by deriving a non-parametric statistical model of the motion via a novel approach based on properties of the Fourier transform. The motion statistics are used in a sequential change detection test to find changes in the motion distributions and, consequently, in the temporal textures. Experiments are conducted on a wide range of videos containing temporal textures, groups of people, and traffic. The proposed approach leads to correct change detection at a low computational cost.
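
To make the overall pipeline concrete, the following sketch illustrates only the general idea of feeding per-frame motion-related statistics, here crudely derived from Fourier magnitude spectra, into a sequential change detection test (a standard one-sided CUSUM is used as a stand-in). It is not the paper's non-parametric motion model or its specific test; the statistic, the baseline estimation from the first frames, and the drift and threshold values are all illustrative assumptions.

    # Illustrative sketch only -- NOT the paper's method.
    import numpy as np

    def motion_statistic(prev_frame, frame):
        """Assumed per-frame statistic: L1 distance between the normalized
        Fourier magnitude spectra of consecutive grayscale frames."""
        f_prev = np.abs(np.fft.fft2(prev_frame.astype(float)))
        f_curr = np.abs(np.fft.fft2(frame.astype(float)))
        f_prev /= f_prev.sum() + 1e-12
        f_curr /= f_curr.sum() + 1e-12
        return float(np.abs(f_curr - f_prev).sum())

    def cusum_change_point(stats, drift=0.5, threshold=8.0):
        """One-sided CUSUM on standardized statistics; returns the first
        index where the cumulative score exceeds `threshold`, or None."""
        stats = np.asarray(stats, dtype=float)
        # Baseline mean/std estimated from the first 10 frames (assumption).
        z = (stats - stats[:10].mean()) / (stats[:10].std() + 1e-12)
        g = 0.0
        for t, zt in enumerate(z):
            g = max(0.0, g + zt - drift)
            if g > threshold:
                return t
        return None

    # Usage with synthetic frames: a "motion" change is simulated halfway
    # through by switching to frames with different spectral content.
    rng = np.random.default_rng(0)
    frames = [rng.normal(size=(64, 64)) for _ in range(50)]
    frames += [rng.normal(size=(64, 64)).cumsum(axis=1) for _ in range(50)]
    stats = [motion_statistic(a, b) for a, b in zip(frames[:-1], frames[1:])]
    print("detected change near frame:", cusum_change_point(stats))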