We propose a method for human full-body pose tracking from the measurements of wearable inertial sensors. Since the data provided by such sensors is sparse, noisy, and often ambiguous, we use a compound prior model of feasible human poses to constrain the tracking problem. Our model consists of several low-dimensional, activity-specific motion models and an efficient, sampling-based activity switching mechanism. We restrict the search space for pose tracking by means of manifold learning. Combined with the portability of wearable sensors, our method allows us to track human full-body motion in unconstrained environments. In fact, we are able to simultaneously classify the activity a person is performing and estimate the full-body pose. Experiments on movement sequences containing different activities show that our method can seamlessly detect activity switches and precisely reconstruct full-body pose from the data of only six wearable inertial sensors.