How to measure and model the similarity between different music items is one of the most fundamental yet challenging research problems in music information retrieval. This paper presents a novel multimodal and adaptive music similarity measure (CompositeMap) and its application in a personalized multimodal music search system. CompositeMap can effectively combine music properties from different aspects into compact signatures via supervised learning, which lays the foundation for effective and efficient music search. In addition, an incremental Locality Sensitive Hashing algorithm is developed to support more efficient search processes. Experimental results on two large music collections demonstrate the advantages of the proposed music similarity measure and search system in effectiveness, efficiency, adaptability, and scalability.

Categories and Subject Descriptors: H.3.3 [Information Search and Retrieval]: Query formulation, Search process; H.5.5 [Sound and M...
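To make the signature-plus-hashing idea concrete, the sketch below indexes signature vectors with random-hyperplane Locality Sensitive Hashing for cosine similarity. This is a generic, illustrative LSH index under assumed parameters (`n_bits`, `dim`), not the paper's incremental variant; the class and method names are hypothetical.

```python
import numpy as np
from collections import defaultdict

class CosineLSH:
    """Random-hyperplane LSH index for approximate nearest-neighbour
    search over music signature vectors (illustrative sketch only;
    not the paper's incremental LSH algorithm)."""

    def __init__(self, dim, n_bits=16, seed=0):
        rng = np.random.default_rng(seed)
        # Each random hyperplane contributes one bit of the hash key.
        self.planes = rng.standard_normal((n_bits, dim))
        self.buckets = defaultdict(list)

    def _key(self, v):
        # Sign pattern of the projections -> binary hash key.
        return tuple((self.planes @ v > 0).astype(int))

    def insert(self, item_id, v):
        # New items hash straight into their bucket, so the index
        # can grow incrementally without rebuilding.
        self.buckets[self._key(v)].append((item_id, v))

    def query(self, v, top_k=5):
        # Rank only the candidates that share the query's bucket.
        cands = self.buckets.get(self._key(v), [])
        scored = [(i, float(v @ u) /
                   (np.linalg.norm(v) * np.linalg.norm(u)))
                  for i, u in cands]
        return sorted(scored, key=lambda t: -t[1])[:top_k]
```

A query for an already-indexed signature returns that item first, since it trivially lands in its own bucket with cosine similarity 1.0.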