We present a method for tracking a hand while it is interacting
with an object. This setting is arguably the one in which
hand tracking has the most practical relevance, but it poses significant
additional challenges: strong occlusions by the object
as well as self-occlusions are the norm, and classical
anatomical constraints need to be softened due to the external
forces between hand and object. To achieve robustness
to partial occlusions, we use an individual local tracker for
each segment of the articulated structure. The segments
are connected in a pairwise Markov random field, which
enforces the anatomical hand structure through soft constraints
on the joints between adjacent segments. The most
likely hand configuration is found with belief propagation.
Both range and color data are used as input. Experiments
are presented for synthetic data with ground truth and for
real data of people manipulating objects.
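The inference scheme described above, local evidence per segment combined through soft pairwise joint constraints and resolved by max-product belief propagation, can be illustrated with a small sketch. For simplicity the sketch uses a chain of segments (e.g. one finger); on the paper's full kinematic hand tree, max-product BP proceeds the same way by passing messages from the leaves to the root. The function name, the discrete state spaces, and the toy scores are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def map_chain_bp(unary, pairwise):
    """Max-product belief propagation on a chain-structured MRF.

    unary:    list of 1-D arrays; unary[i][s] is the local tracker's
              log-score for segment i being in discrete state s
              (hypothetical stand-in for the per-segment trackers).
    pairwise: list of 2-D arrays; pairwise[i][s, t] is the soft joint
              compatibility (log-potential) between state s of
              segment i and state t of segment i + 1.
    Returns the most likely state index per segment.
    """
    n = len(unary)
    msg = [np.zeros(len(u)) for u in unary]  # message into segment i from below
    back = [None] * n                        # backpointers for decoding
    # Backward pass: send max-product messages toward segment 0.
    for i in range(n - 2, -1, -1):
        scores = pairwise[i] + (unary[i + 1] + msg[i + 1])[None, :]
        back[i] = np.argmax(scores, axis=1)
        msg[i] = np.max(scores, axis=1)
    # Forward pass: read off the MAP configuration via backpointers.
    states = [int(np.argmax(unary[0] + msg[0]))]
    for i in range(n - 1):
        states.append(int(back[i][states[-1]]))
    return states

# Toy example: three finger segments, four candidate poses each.
rng = np.random.default_rng(0)
unary = [rng.normal(size=4) for _ in range(3)]
# Soft anatomical constraint: quadratic penalty on pose disagreement
# at the joint between adjacent segments.
pairwise = [-0.5 * (np.arange(4)[:, None] - np.arange(4)[None, :]) ** 2
            for _ in range(2)]
print(map_chain_bp(unary, pairwise))
```

Because occluded segments contribute only weak unary evidence, the soft joint potentials let their neighbors pull them toward anatomically plausible states, which is the intended robustness mechanism.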