Abstract. This paper describes an integrated architecture for online collaborative multimedia (audio and text) meetings that supports the recording of participants' audio exchanges, automatic metadata generation, and the logging of users' editing interactions, as well as of information derived from the use of group awareness widgets (gesturing), for post-meeting processing and access. We propose a formal model for timestamping the generation and manipulation of textual artefacts. Post-meeting processing of the interaction information highlights the usefulness of such histories for tracking information that would normally be lost in conventional collaborative editing settings. The potential applications of such automatic interaction history generation range from quantitative analysis of group interaction to cooperation modelling and multimedia meeting mining.