In this paper, we present a method for tracking and retexturing garments that exploits the full image information via the optical flow constraint instead of relying on distinct features. In a hierarchical framework, we refine the motion model at every level. The motion model is used to regularize the optical flow field, so that finding the best transformation amounts to minimizing an error function in a least-squares sense. Knowledge of the position and deformation of the garment in 2D allows us to erase the old texture and replace it with a new one with correct deformation and shading properties, without 3D reconstruction. Additionally, the method provides an estimate of the irradiance, so that the new texture can be illuminated realistically.
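
As a minimal illustration of the kind of energy this regularization suggests (our notation, not necessarily the paper's), a parametric warp $W(\mathbf{x};\mathbf{p})$ combined with brightness constancy yields

% Sketch only: I_t and I_{t+1} denote consecutive frames, R the garment region,
% and p the motion-model parameters refined at each level of the hierarchy.
\begin{equation*}
  E(\mathbf{p}) \;=\; \sum_{\mathbf{x} \in R}
  \bigl[\, I_{t+1}\!\bigl(W(\mathbf{x};\mathbf{p})\bigr) - I_t(\mathbf{x}) \,\bigr]^2 ,
\end{equation*}

which, after linearizing $I_{t+1}$ around the current estimate (the optical flow constraint), reduces to a linear least-squares problem in the parameter update.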