Rhythm, beat and meter are key concepts in music. Many efforts have been made in recent years to automatically extract beat and meter from a piece of music given in either audio or symbolic representation (see e.g. [11] for an overview). In this paper we propose a new method for extracting beat, meter and phase information from a list of unquantized onset times. The procedure relies on a novel technique called 'Gaussification' and employs correlation methods, with parameter settings informed by findings from music psychology.
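To make the general idea concrete, the following is a minimal sketch, assuming that Gaussification means replacing each onset time by a Gaussian kernel on a regular time grid and that periodicity candidates are then read off an autocorrelation of the resulting signal. The kernel width, grid step and lag range used here are illustrative assumptions, not the parameter settings proposed in the paper.

```python
import numpy as np

def gaussify(onsets, sigma=0.02, dt=0.005, pad=1.0):
    """Replace each onset time (s) by a Gaussian of width sigma (s),
    sampled on a regular grid with step dt (s)."""
    onsets = np.asarray(onsets, dtype=float)
    t = np.arange(0.0, onsets.max() + pad, dt)
    # Sum of Gaussians centred at the onset times.
    g = np.exp(-0.5 * ((t[:, None] - onsets[None, :]) / sigma) ** 2).sum(axis=1)
    return t, g

def beat_period_candidates(g, dt, min_lag=0.2, max_lag=2.0):
    """Autocorrelate the Gaussified onset signal and return lags (s)
    within a plausible beat range, sorted by correlation strength."""
    ac = np.correlate(g, g, mode="full")[len(g) - 1:]  # non-negative lags only
    lags = np.arange(len(ac)) * dt
    mask = (lags >= min_lag) & (lags <= max_lag)
    order = np.argsort(ac[mask])[::-1]
    return lags[mask][order], ac[mask][order]

# Example: onsets of an imperfectly played pulse of roughly 0.5 s
onsets = [0.00, 0.52, 0.99, 1.51, 2.02, 2.48, 3.01]
t, g = gaussify(onsets)
lags, strengths = beat_period_candidates(g, dt=0.005)
print(f"strongest period candidate: {lags[0]:.3f} s")
```

In this sketch the beat period emerges as the strongest autocorrelation peak within the chosen lag window; meter and phase estimation would require further analysis beyond this illustration.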