This paper introduces a framework for tracking 3D human movement using a Gaussian process dynamical model (GPDM) and a particle filter. The framework combines particle filtering with discriminative learning, so that no explicit 3D body model is needed and an optimal proposal distribution can be used. The structures of the joint motion and the image appearance are modelled with GPDMs in low-dimensional latent spaces. A relevance vector machine (RVM) is used to construct the regression mapping between the image latent space and the joint-angle latent space from a small training set. The backward mapping from appearance to the motion latent space allows samples to be drawn according to the most recent observation, while the forward mapping from joint angles to silhouettes speeds up particle weight evaluation by avoiding the generation of synthetic images during tracking. Experimental results show that the approach tracks 3D human movement accurately under noisy images and across different subjects' movements.
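The abstract describes the tracking loop only at a high level. The Python sketch below illustrates one possible particle-filter step in the joint-angle latent space; the callables `gpdm_dynamics`, `rvm_backward`, and `rvm_forward`, the mixed proposal, and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def particle_filter_step(particles, weights, silhouette_features,
                         gpdm_dynamics, rvm_backward, rvm_forward,
                         sigma_obs=0.1, mix=0.5):
    """One hypothetical tracking step in the joint-angle latent space.

    particles            : (N, d) latent states at time t-1
    silhouette_features  : (k,) descriptor of the current image observation
    gpdm_dynamics(x)     : assumed GPDM predictive mean (N, d) and variance (N,)
    rvm_backward(y)      : assumed RVM map from image features to motion latent space
    rvm_forward(x)       : assumed RVM map from a latent point to silhouette features
    """
    n, d = particles.shape

    # Proposal: blend the GPDM dynamics prediction with the observation-driven
    # (backward-mapped) estimate, so samples reflect the most recent observation.
    mu_dyn, var_dyn = gpdm_dynamics(particles)
    mu_obs = rvm_backward(silhouette_features)
    proposal_mean = mix * mu_dyn + (1.0 - mix) * mu_obs
    new_particles = proposal_mean + np.sqrt(var_dyn)[:, None] * np.random.randn(n, d)

    # Weighting: compare forward-mapped silhouette features with the observed
    # features, so no synthetic images need to be rendered.
    pred = np.stack([rvm_forward(x) for x in new_particles])
    err = np.sum((pred - silhouette_features) ** 2, axis=1)
    new_weights = weights * np.exp(-0.5 * err / sigma_obs ** 2)
    new_weights /= new_weights.sum()

    # Resample when the effective sample size drops too low.
    if 1.0 / np.sum(new_weights ** 2) < n / 2:
        idx = np.random.choice(n, size=n, p=new_weights)
        new_particles, new_weights = new_particles[idx], np.full(n, 1.0 / n)
    return new_particles, new_weights
```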