This paper introduces scale transforms to measure rhythmic similarity between two musical pieces. The rhythm of a piece of music is described by the scale transform magnitude, computed by transforming the sample autocorrelation of its onset strength signal to the scale domain. Two pieces can then be compared independently of tempo differences using simple distances between these descriptors, such as the cosine distance. A widely used dance music dataset has been chosen for proof of concept. On this dataset, the proposed scale transform method achieves classification accuracy on par with other state-of-the-art approaches. On a second dataset, which is characterized by much larger intra-class tempo variance, the scale transform based measure improves classification accuracy by 41% compared to previously presented measures.
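To make the described pipeline concrete, the sketch below illustrates the three stages in Python: onset strength extraction, sample autocorrelation, and a direct numerical scale (Mellin) transform whose magnitude is compared with the cosine distance. This is a minimal illustration under our own assumptions (the use of librosa for onset strength, the discretization of the transform, and all function names and parameter values are ours, not the paper's implementation); the key property it relies on is that the scale transform magnitude is invariant to time scaling, so tempo changes do not affect the descriptor.

```python
import numpy as np
import librosa

def scale_transform_magnitude(y, sr=22050, hop_length=512,
                              num_scales=200, c_max=100.0):
    """Scale transform magnitude of the autocorrelation of an
    onset strength signal (illustrative sketch, not the paper's code)."""
    # Onset strength (novelty) envelope of the audio signal.
    onset_env = librosa.onset.onset_strength(y=y, sr=sr, hop_length=hop_length)
    # Biased sample autocorrelation, positive lags only, normalized at lag 0.
    r = np.correlate(onset_env, onset_env, mode="full")[len(onset_env) - 1:]
    r = r / r[0]
    # Direct numerical scale (Mellin) transform:
    #   D(c) = (1 / sqrt(2*pi)) * integral_0^inf r(t) * t^(-j*c - 1/2) dt
    dt = hop_length / sr                      # lag resolution in seconds
    t = dt * np.arange(1, len(r))             # skip t = 0 (singular weight)
    c = np.linspace(0.0, c_max, num_scales)   # scale-frequency axis
    kernel = t[None, :] ** (-1j * c[:, None] - 0.5)
    D = (kernel @ r[1:]) * dt / np.sqrt(2 * np.pi)
    # The magnitude is invariant to time scaling of r(t), i.e. to tempo.
    return np.abs(D)

def rhythm_distance(y1, y2, sr=22050):
    """Cosine distance between two scale transform magnitude descriptors."""
    d1 = scale_transform_magnitude(y1, sr=sr)
    d2 = scale_transform_magnitude(y2, sr=sr)
    return 1.0 - np.dot(d1, d2) / (np.linalg.norm(d1) * np.linalg.norm(d2))
```

A small distance then indicates rhythmic similarity regardless of tempo; in practice the descriptor would be computed once per piece and compared pairwise across a dataset.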