This paper describes the concept and control of a 3D Avatar system with facial expressions (mimics) and gestures as a conversational user interface. The Avatar system, including gestures and facial expressions, is based on morphing techniques. It can generate sign language and mouth motion at lip-reading quality in real time. The concept of a speech act is introduced, and a table is defined to classify conversation fragments. Furthermore, two rule-based systems have been implemented to control the Avatar's behavior and gestures. The system is shown to be useful in several application areas.

Keywords: Sign Language, Lip Reading, Gesture, Morphing, Conversational Interfaces