Because human vision has a limited field of view, our perception of most scenes is built up over time as our eyes scan the scene. For static scenes this process can be modeled by panoramic mosaicing: stitching images together into a panoramic view. Can a dynamic scene, scanned by a video camera, be represented by a dynamic panoramic video even though different regions were visible at different times? In this paper we explore manipulation of the flow of time in video, such as the creation of new videos in which events that occurred at different times are displayed simultaneously. More general changes in the time flow are also possible, enabling, for example, the re-ordering of dynamic events in the video. We generate dynamic mosaics by sweeping the aligned space-time volume of the input video with a time front surface, generating a sequence of time slices in the process. Various sweeping strategies and different time front evolutions manipulate the time flow in the...
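To make the sweeping idea concrete, the following is a minimal sketch rather than the paper's implementation: it assumes a pre-aligned space-time volume stored as a NumPy array, nearest-frame sampling along the time axis, and a simple slanted-plane time front that advances by one input frame per output frame; the function names (sweep_time_front, linear_time_fronts) and the slope parameter are hypothetical.

```python
import numpy as np

def sweep_time_front(volume, time_fronts):
    """Slice an aligned space-time volume with a sequence of time fronts.

    volume:      array of shape (T, H, W) or (T, H, W, C), assumed to hold
                 pre-aligned (motion-compensated) video frames.
    time_fronts: iterable of (H, W) arrays; time_fronts[k][y, x] is the input
                 time at which pixel (x, y) is sampled for output frame k.
    """
    T, H, W = volume.shape[:3]
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    frames = []
    for front in time_fronts:
        # Nearest-frame sampling; a full implementation would interpolate in time.
        t = np.clip(np.rint(front).astype(int), 0, T - 1)
        frames.append(volume[t, ys, xs])  # one time slice of the volume
    return np.stack(frames)

def linear_time_fronts(T, H, W, slope=0.5, steps=None):
    """A hypothetical sweeping strategy: a time front slanted along x that
    advances by one input frame per output frame."""
    steps = T if steps is None else steps
    x = np.broadcast_to(np.arange(W), (H, W)).astype(float)
    return [slope * x + k for k in range(steps)]
```

Under these assumptions, swapping linear_time_fronts for other front shapes or evolution rules changes which input times contribute to each output pixel, and hence how time flows in the resulting video.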