To increase the believability and life-likeness of Embodied Conversational Agents (ECAs), we introduce a behavior synthesis technique for generating expressive gesturing. A small set of expressivity dimensions is used to characterize the individual variability of movement. We empirically evaluate our implementation in two separate user studies. The results suggest that our approach works well for a subset of expressive behavior. However, animation fidelity is not yet high enough to convey subtle changes, and interaction effects between the different parameters require further study.

Categories and Subject Descriptors
H5.2 [Information Interfaces and Presentation]: User Interfaces — evaluation/methodology

General Terms
Experimentation

Keywords
Embodied Conversational Agents
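As a rough illustration of the idea of modulating gesture by a small set of expressivity dimensions, the sketch below scales neutral gesture keyframes by two hypothetical parameters. The dimension names (`spatial_extent`, `temporal_extent`) and the scaling factors are illustrative assumptions, not the paper's actual parameter set or synthesis algorithm.

```python
from dataclasses import dataclass

@dataclass
class Expressivity:
    # Hypothetical dimensions, each in [-1, 1]; 0.0 means a neutral performance.
    spatial_extent: float = 0.0   # amplitude of gesture trajectories
    temporal_extent: float = 0.0  # speed of the gesture (higher = faster)

def apply_expressivity(keyframes, expr):
    """Modulate neutral gesture keyframes [(time, amplitude), ...]:
    wider or narrower in space, faster or slower in time."""
    amp_scale = 1.0 + 0.5 * expr.spatial_extent    # up to +/-50% amplitude
    time_scale = 1.0 - 0.3 * expr.temporal_extent  # shorter times = faster motion
    return [(t * time_scale, a * amp_scale) for t, a in keyframes]

# A neutral one-stroke gesture, then an expansive, hurried variant of it.
neutral = [(0.0, 0.0), (0.4, 1.0), (0.8, 0.2)]
expansive = apply_expressivity(
    neutral, Expressivity(spatial_extent=1.0, temporal_extent=1.0)
)
```

An individual agent would keep a fixed `Expressivity` setting across its gestures, so that the same underlying behavior reads as characteristically restrained or exuberant.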