Much working time is spent in meetings, and as a consequence, meetings have become the subject of multidisciplinary research. Virtual Meeting Rooms (VMRs) are 3D virtual replicas of meeting rooms, in which the various modalities, such as speech, gaze, distance, gestures and facial expressions, can be controlled. This allows VMRs to be used to improve remote meeting participation, to visualize multimedia data, and as an instrument for research into social interaction in meetings. This paper describes how these three uses can be realized in the various stages of the development of a VMR. We describe the process from observation through annotation to simulation, together with a model of the relations between the annotated features of verbal and nonverbal conversational behavior. As an example of using a VMR for research into how a single modality can influence the perception of social interaction, we present an experiment in the VMR in which humans had to identify the speaker in a multi-party conversation.