Music visualization gives users a new interface for browsing, searching, and navigating a personal digital music collection. Although several prior works visualize music collections based on "surface" metadata such as artist, album, and genre, little work has addressed content-based or perception-based visualization. In this paper, we develop an algorithm that automatically estimates human perception of the rhythm and timbre of a music clip. Based on these two values, each clip is mapped to a point in a 2D timbre-rhythm space, yielding a perception-based visualization of the collection. Experimental evaluation indicates that this visualization is helpful in many music management tasks, such as music navigation, similar-music search, and playlist generation.
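
The core idea, as described, reduces each clip to a (timbre, rhythm) pair and treats proximity in that plane as perceptual similarity. The following is a minimal sketch of that mapping, assuming hypothetical, pre-computed scores in [0, 1]; the paper's actual estimation algorithm is not reproduced here, and the clip names and scores are placeholders.

```python
# Illustrative sketch only: the rhythm/timbre estimators from the paper are not
# specified here, so `clips` uses hypothetical, hand-assigned scores in [0, 1].
from math import hypot

# Each clip is reduced to two perceptual values: (timbre, rhythm).
clips = {
    "clip_a": (0.20, 0.85),
    "clip_b": (0.25, 0.80),
    "clip_c": (0.90, 0.10),
}

def nearest_neighbors(query: str, k: int = 2) -> list[str]:
    """Rank the other clips by Euclidean distance in the 2D timbre-rhythm space."""
    qt, qr = clips[query]
    others = ((hypot(t - qt, r - qr), name)
              for name, (t, r) in clips.items() if name != query)
    return [name for _, name in sorted(others)[:k]]

# Similar-music search / playlist seeding: clips close together in the
# 2D space are treated as perceptually similar.
print(nearest_neighbors("clip_a"))  # -> ['clip_b', 'clip_c']
```

Under this reading, similar-music search and playlist generation both reduce to simple neighborhood queries in the 2D space, which is what makes the visualization directly navigable.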