For the approximation of time-dependent data tensors and of solutions to tensor differential equations by tensors of low Tucker rank, we study a computational approach that can be viewed as a continuous-time updating procedure. This approach works with the increments rather than the full tensor and avoids the computation of decompositions of large matrices. In this method, the derivative is projected onto the tangent space of the manifold of tensors of Tucker rank $(r_1,\dots,r_N)$ at the current approximation. This yields nonlinear differential equations for the factors in a Tucker decomposition, suitable for numerical integration. Approximation properties of this approach are analyzed.

Key words. Low-rank approximation, time-varying tensors, continuous updating, Tucker decomposition, tensor differential equations.

AMS subject classifications. 15A18, 15A69, 65F99, 65L05
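The tangent-space projection described above can be sketched as follows (the notation here is a standard formulation of dynamical low-rank approximation, assumed rather than quoted from this paper). Writing $\mathcal{M}$ for the manifold of tensors of Tucker rank $(r_1,\dots,r_N)$ and $A(t)$ for the given time-dependent tensor, the approximation $Y(t)\in\mathcal{M}$ is determined by the Galerkin-type condition

```latex
% Minimize the defect of the time derivative over the tangent space:
\dot{Y}(t) \in T_{Y(t)}\mathcal{M}
\quad\text{such that}\quad
\bigl\| \dot{Y}(t) - \dot{A}(t) \bigr\| = \min ,
% equivalently, with P(Y) the orthogonal projection onto T_Y M:
\dot{Y}(t) = P\bigl(Y(t)\bigr)\,\dot{A}(t).
```

Because the right-hand side involves only the increment $\dot{A}(t)$ and quantities on the current low-rank approximation, this formulation leads to differential equations for the Tucker factors that can be integrated without ever forming or decomposing the full tensor.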