An error concealment algorithm based on the flow of facial expression is proposed to improve the communication of animated facial data over limited-bandwidth, error-prone channels. Facial expression flow is tracked using the dominant muscles, defined as those with the maximum change between two successive frames. The receiver uses linear interpolation, together with the facial expression flow information, to interpolate the erroneous facial animation data. Experimental results show that the proposed error concealment method improves the quality of animated face communication.
Insu Park, Shahram Shirani, David W. Capson
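The concealment idea described in the abstract can be illustrated with a minimal sketch. It assumes facial animation frames are vectors of muscle actuation parameters and that a lost frame is reconstructed from the surrounding correctly received frames; the function names, the number of dominant muscles `k`, and the choice to hold non-dominant parameters at their previous values are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def dominant_muscles(prev_frame, next_frame, k=3):
    """Return indices of the k muscle parameters with the largest change
    between two successive correctly received frames (an illustrative
    stand-in for the paper's dominant-muscle selection)."""
    change = np.abs(next_frame - prev_frame)
    return np.argsort(change)[-k:]

def conceal_frame(prev_frame, next_frame, t):
    """Reconstruct an erroneous frame at normalized time t in (0, 1)
    between the surrounding good frames via linear interpolation.
    Non-dominant parameters are held at the previous value (an assumption),
    so the interpolated motion follows the tracked expression flow."""
    concealed = prev_frame.copy()
    idx = dominant_muscles(prev_frame, next_frame)
    concealed[idx] = (1 - t) * prev_frame[idx] + t * next_frame[idx]
    return concealed

# Example: conceal one lost frame halfway between two received frames.
prev = np.array([0.10, 0.40, 0.05, 0.80])   # muscle actuations, frame n-1
nxt  = np.array([0.12, 0.70, 0.06, 0.20])   # muscle actuations, frame n+1
print(conceal_frame(prev, nxt, t=0.5))
```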