Instead of the conventional background/foreground definition, we propose a novel method that decomposes a scene into time-varying background and foreground intrinsic images; multiplying these images reconstructs the scene. First, we assemble a set of previous images over a temporal scale and compute their spatial gradients. Exploiting the sparseness of the filter outputs, we estimate the background by median filtering the gradients over time and derive the corresponding foreground from this background estimate. We also propose a robust method for thresholding the foreground to obtain a change-detection mask of the moving pixels. We further show that a different set of filters can detect static and moving lines. Computationally, the proposed method is comparable to the state of the art, and our simulations demonstrate the effectiveness of the intrinsic background/foreground decomposition even under sudden and severe illumination changes.
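To make the multiplicative decomposition concrete, the sketch below implements one plausible reading of the pipeline described above: log-domain spatial gradients are computed for a temporal window of frames, the temporal median of these gradients (which suppresses the sparse foreground responses) gives the background gradients, the background is recovered from that gradient field, and the foreground follows from the multiplicative model I = B · F. This is a minimal illustration, not the authors' reference implementation: the function names (poisson_reconstruct, decompose), the eps regularizer, the DCT-based Poisson reconstruction, and the use of a single window (rather than a sliding window yielding a time-varying background) are all assumptions made for brevity.

```python
"""Minimal sketch of intrinsic background/foreground decomposition via the
temporal median of log-domain spatial gradients (illustrative assumptions,
not the paper's reference implementation)."""
import numpy as np
from scipy.fft import dctn, idctn


def poisson_reconstruct(gx, gy):
    """Recover a scalar field (up to an additive constant) whose forward-difference
    gradients best match (gx, gy), using a DCT-based Poisson solver with
    Neumann boundary conditions."""
    h, w = gx.shape
    # Divergence of the gradient field via backward differences.
    div = np.zeros((h, w))
    div[:, 1:] += gx[:, 1:] - gx[:, :-1]
    div[:, 0] += gx[:, 0]
    div[1:, :] += gy[1:, :] - gy[:-1, :]
    div[0, :] += gy[0, :]
    # Solve Laplacian(u) = div in the DCT domain.
    f = dctn(div, type=2, norm='ortho')
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')
    denom = 2.0 * (np.cos(np.pi * xx / w) - 1.0) + 2.0 * (np.cos(np.pi * yy / h) - 1.0)
    denom[0, 0] = 1.0          # avoid division by zero at the DC term
    u = f / denom
    u[0, 0] = 0.0              # solution is defined only up to a constant
    return idctn(u, type=2, norm='ortho')


def decompose(frames, eps=1e-6):
    """frames: (T, H, W) array of grayscale images in [0, 1].
    Returns a background estimate B and per-frame foregrounds F so that
    frames[t] ~ B * F[t] (multiplicative model)."""
    logs = np.log(frames + eps)
    # Forward-difference spatial gradients of each log-image.
    gx = np.diff(logs, axis=2, append=logs[:, :, -1:])
    gy = np.diff(logs, axis=1, append=logs[:, -1:, :])
    # Temporal median of the gradients: foreground responses are sparse,
    # so the median retains the static background structure.
    bg_gx = np.median(gx, axis=0)
    bg_gy = np.median(gy, axis=0)
    log_bg = poisson_reconstruct(bg_gx, bg_gy)
    # Fix the unknown offset by matching median log-intensities.
    log_bg += np.median(logs) - np.median(log_bg)
    background = np.exp(log_bg)
    foregrounds = frames / (background + eps)
    return background, foregrounds
```

A change-detection mask can then be obtained by thresholding the foreground, e.g. `mask = np.abs(np.log(foregrounds + eps)) > tau` for some cutoff `tau`; the robust thresholding scheme proposed in the paper is not reproduced here, and a fixed cutoff is used purely for illustration.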