Bingbing Ni, Stefan Winkler, Ashraf A. Kassim

In this paper, we present a 3D registration algorithm based on simulated physical force/moment for articulated human motion tracking. Given sparsely reconstructed 3D human surface points from multiple synchronized cameras, the tracking problem is equivalent to fitting the 3D model to the scene points. The simulated physical force/moment generated by the displacement between the model and the scene points is used to align the model with the scene points in an Iterative Closest Point (ICP) [1] framework. We further introduce a hierarchical scheme for updating the model state, which automatically incorporates human kinematic constraints. Experimental results on both synthetic and real data from several unconstrained motion sequences demonstrate the efficiency and robustness of the proposed method.
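For intuition only, the sketch below illustrates what a single force/moment-driven alignment step for one rigid body segment could look like under an ICP-style closest-point association; it is not the authors' implementation. The function name `force_moment_step`, the gains `gain_t`/`gain_r`, and the use of a k-d tree for nearest-neighbor search are illustrative assumptions, and the hierarchical kinematic update described in the abstract is omitted.

```python
import numpy as np
from scipy.spatial import cKDTree

def force_moment_step(model_pts, scene_pts, joint_center, gain_t=0.5, gain_r=0.5):
    """One hypothetical ICP-style update for a single rigid segment.

    model_pts:    (N, 3) surface points of the segment in world coordinates
    scene_pts:    (M, 3) reconstructed 3D scene points
    joint_center: (3,)   pivot about which the segment may rotate
    Returns an updated copy of model_pts (illustrative sketch only).
    """
    # 1. Closest-point correspondences (the ICP data-association step).
    tree = cKDTree(scene_pts)
    _, idx = tree.query(model_pts)
    displacement = scene_pts[idx] - model_pts          # per-point "force" vectors

    # 2. Net simulated force (drives translation) and moment about the joint
    #    (drives rotation), accumulated over all correspondences.
    force = displacement.mean(axis=0)
    lever = model_pts - joint_center
    moment = np.cross(lever, displacement).mean(axis=0)

    # 3. Apply a small rotation about the moment axis (Rodrigues' formula)
    #    followed by a small translation along the net force.
    angle = gain_r * np.linalg.norm(moment)
    axis = moment / (np.linalg.norm(moment) + 1e-12)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    R = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)
    return (R @ (model_pts - joint_center).T).T + joint_center + gain_t * force
```

In the paper's setting, such a step would be repeated per body segment and per iteration, with the hierarchical state update propagating constraints along the kinematic chain.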