In this paper, we describe a method for real-time cloth animation. The algorithm works in a hybrid manner, exploiting the merits of both physics-based and geometric deformations. It makes use of predetermined conditions between the cloth and the body model, avoiding complex collision detection and physical deformation wherever possible. Garments are segmented into pieces that are simulated by different algorithms, depending on how they lie on the body surface and whether they stick to it or flow over it. Tests show that the method is well suited to fully dressed virtual human models, achieving real-time performance where ordinary cloth simulations do not.
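To make the hybrid dispatch concrete, the following is a minimal sketch of how segments classified by their relation to the body could each be routed to a different deformation routine. All type names, segment categories, and deformer functions here are illustrative assumptions, not the paper's actual implementation.

```cpp
// Hypothetical sketch of per-segment hybrid dispatch; names are illustrative.
#include <cstdio>
#include <vector>

struct Vertex { float x, y, z; };

// How a garment segment relates to the underlying body surface (assumed taxonomy).
enum class SegmentKind {
    Tight,    // sticks to the skin: follow the body surface directly
    Loose,    // stays near the skin: cheap geometric approximation
    Flowing   // hangs freely: full physics-based simulation
};

struct GarmentSegment {
    SegmentKind kind;
    std::vector<Vertex> vertices;
};

// Placeholder deformers standing in for the actual algorithms.
void deformWithSkin(GarmentSegment& seg)                { /* copy body-surface motion */ }
void deformGeometrically(GarmentSegment& seg)           { /* keep vertices at a skin offset */ }
void simulatePhysically(GarmentSegment& seg, float dt)  { /* e.g. mass-spring time step */ }

// One animation frame: each segment uses the cheapest method that still looks
// plausible, skipping collision detection where the precomputed cloth-body
// relation already constrains the motion.
void animateGarment(std::vector<GarmentSegment>& garment, float dt) {
    for (auto& seg : garment) {
        switch (seg.kind) {
            case SegmentKind::Tight:   deformWithSkin(seg);         break;
            case SegmentKind::Loose:   deformGeometrically(seg);    break;
            case SegmentKind::Flowing: simulatePhysically(seg, dt); break;
        }
    }
}

int main() {
    std::vector<GarmentSegment> shirt = {
        { SegmentKind::Tight,   { {0.0f, 0.0f, 0.0f} } },
        { SegmentKind::Flowing, { {0.0f, -1.0f, 0.0f} } }
    };
    animateGarment(shirt, 1.0f / 60.0f);
    std::printf("animated %zu segments\n", shirt.size());
    return 0;
}
```

The design point the sketch illustrates is that only the freely hanging segments pay the full cost of physical simulation, which is what makes real-time rates attainable for fully dressed characters.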