We present a method for improving a 3D facial model through interactive feedback from mapping a texture obtained with a 3D scanner. The method is based on extracting features from both the 3D model and the texture, and on deforming a generic head model. The feature area of the texture is first mapped onto the corresponding part of a generic head model; the generic model is then reconstructed according to the feedback from this texture mapping. Our approach differs from existing methods in that we model the components of a face (e.g., mouth, eyes, nose) using customized 3D curves, which helps preserve the shapes of these features during interactive modification. A Java 3D implementation of the proposed method has also been developed.
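The curve-based deformation idea can be illustrated with a minimal sketch. All names here are hypothetical and the falloff weighting is an assumption, not the paper's exact formulation: a facial feature (say, a mouth outline) is represented as a 3D curve of control points, and generic-model vertices near the curve follow the control points as they move toward target positions derived from the scanned texture.

```java
import java.util.Arrays;

/**
 * Minimal sketch (hypothetical names, assumed weighting scheme): vertices of a
 * generic head model that lie near a feature curve follow the curve's control
 * points as those points move to target positions derived from the texture.
 * Blending the control-point offsets with a distance-based weight keeps the
 * feature's shape coherent during interactive modification.
 */
public class FeatureCurveDeformer {

    /** Displace each vertex by a distance-weighted blend of control-point offsets. */
    static double[][] deform(double[][] vertices, double[][] curve,
                             double[][] target, double radius) {
        double[][] out = new double[vertices.length][3];
        for (int v = 0; v < vertices.length; v++) {
            double[] disp = new double[3];
            double wSum = 0.0;
            for (int c = 0; c < curve.length; c++) {
                double d = dist(vertices[v], curve[c]);
                if (d < radius) {
                    double w = 1.0 - d / radius;   // linear falloff inside the influence radius
                    for (int k = 0; k < 3; k++)
                        disp[k] += w * (target[c][k] - curve[c][k]);
                    wSum += w;
                }
            }
            for (int k = 0; k < 3; k++)
                out[v][k] = vertices[v][k] + (wSum > 0 ? disp[k] / wSum : 0);
        }
        return out;
    }

    static double dist(double[] a, double[] b) {
        double s = 0;
        for (int k = 0; k < 3; k++) s += (a[k] - b[k]) * (a[k] - b[k]);
        return Math.sqrt(s);
    }

    public static void main(String[] args) {
        // One control point moves 0.1 along x; a vertex at the same position
        // follows fully, while a far-away vertex stays put.
        double[][] verts  = {{0, 0, 0}, {5, 0, 0}};
        double[][] curve  = {{0, 0, 0}};
        double[][] target = {{0.1, 0, 0}};
        System.out.println(Arrays.deepToString(deform(verts, curve, target, 1.0)));
        // → [[0.1, 0.0, 0.0], [5.0, 0.0, 0.0]]
    }
}
```

In the full method, the control points would come from features detected in the texture, and the deformed vertices would feed back into the interactive texture-mapping loop.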