This paper presents a method to retarget the motion of one character to another in real time. The technique is based on inverse rate control, which computes the changes in joint angles corresponding to the changes in end-effector position. While tracking the multiple end-effector trajectories of the original subject or character, our on-line motion retargetting also minimizes the joint angle differences by exploiting the kinematic redundancies of the animated model. The method can thus generalize a captured motion to a character with a different anthropometry performing a slightly different motion, while preserving the original motion characteristics. Because the above is done on-line, a real-time performance can be mapped to other characters. Moreover, if the method is used interactively during a motion capture session, the on-screen feedback of the retargetted motion increases the chances of obtaining satisfactory results. As a by-product, our algorithm can be used to reduce measurement errors in restoring captured motion.
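
To illustrate the kind of computation the abstract describes, the sketch below shows the standard resolved-rate (pseudoinverse plus null-space) formulation: a primary term that tracks the desired end-effector change and a secondary term, projected into the Jacobian null space, that pulls the joints toward the reference (captured) angles. This is a minimal illustration of the idea rather than the paper's exact algorithm; the function name inverse_rate_step, the gain k, and the NumPy-based implementation are our own assumptions.

import numpy as np

def inverse_rate_step(J, dx, q, q_ref, k=1.0):
    """One inverse-rate-control update (illustrative sketch only).

    J     : (m, n) end-effector Jacobian of the target character
    dx    : (m,)   desired change in end-effector position for this frame
    q     : (n,)   current joint angles of the target character
    q_ref : (n,)   reference (captured) joint angles to stay close to
    k     : gain on the secondary, joint-difference-minimizing task
    """
    J_pinv = np.linalg.pinv(J)                    # Moore-Penrose pseudoinverse
    track = J_pinv @ dx                           # primary task: follow the end-effector
    null_proj = np.eye(J.shape[1]) - J_pinv @ J   # projector onto the Jacobian null space
    pull = null_proj @ (k * (q_ref - q))          # secondary task: reduce joint-angle difference
    return track + pull                           # combined joint-angle update

# Toy usage: a redundant 3-joint planar arm with unit link lengths.
q = np.array([0.4, 0.3, 0.2])
q_ref = np.array([0.5, 0.1, 0.3])
s = np.cumsum(q)                                  # absolute link angles q1, q1+q2, q1+q2+q3
J = np.array([[-np.sin(s[0]) - np.sin(s[1]) - np.sin(s[2]),
               -np.sin(s[1]) - np.sin(s[2]),
               -np.sin(s[2])],
              [ np.cos(s[0]) + np.cos(s[1]) + np.cos(s[2]),
                np.cos(s[1]) + np.cos(s[2]),
                np.cos(s[2])]])
dq = inverse_rate_step(J, np.array([0.01, 0.0]), q, q_ref)

Because the arm in the example has three joints but only a two-dimensional end-effector position, the null space is non-trivial, so the secondary term can reduce the joint-angle differences without disturbing the end-effector tracking.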