We present a simple and efficient approach to turn laser-scanned human geometry into a realistically moving virtual avatar. Instead of relying on the classical skeleton-based animation pipeline, our method uses a mesh-based Laplacian editing scheme to drive the motion of the scanned model. Our framework elegantly solves the motion retargeting problem and produces realistic non-rigid surface deformation with minimal user interaction. Realistic animations can easily be generated from a variety of input motion descriptions, which we exemplify by applying our method to both marker-free and marker-based motion capture data.
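Since Laplacian editing is the core machinery named above, a minimal sketch may help make the idea concrete: preserve each vertex's differential (Laplacian) coordinates while softly constraining a few handle vertices, then solve the resulting least-squares system. This is a generic uniform-weight version; the toy chain mesh, the `laplacian_edit` helper, and the constraint weight are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def laplacian_edit(V, edges, handles, w=10.0):
    """Deform a mesh by Laplacian editing with uniform weights.

    V       : (n, d) array of vertex positions
    handles : dict mapping vertex index -> target position
    w       : soft positional-constraint weight (assumption)
    """
    n = V.shape[0]
    # Uniform graph Laplacian L = I - D^{-1} A
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    deg = A.sum(axis=1)
    L = np.eye(n) - A / deg[:, None]
    delta = L @ V  # differential coordinates to preserve

    # Stack weighted positional constraints under the Laplacian rows
    C = np.zeros((len(handles), n))
    rhs_c = np.zeros((len(handles), V.shape[1]))
    for k, (idx, pos) in enumerate(handles.items()):
        C[k, idx] = w
        rhs_c[k] = w * np.asarray(pos, dtype=float)

    Asys = np.vstack([L, C])
    b = np.vstack([delta, rhs_c])
    Vnew, *_ = np.linalg.lstsq(Asys, b, rcond=None)
    return Vnew

# Toy example: a straight chain of 5 vertices; pin one end, lift the other.
# The interior vertices bend smoothly to preserve the Laplacian coordinates.
V = np.array([[i, 0.0] for i in range(5)])
edges = [(i, i + 1) for i in range(4)]
Vnew = laplacian_edit(V, edges, {0: [0.0, 0.0], 4: [4.0, 2.0]})
```

On a real scanned mesh the same system would use cotangent rather than uniform weights and a sparse solver, with the motion-capture data supplying the handle trajectories frame by frame.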