Search and retrieval of specific musical content, such as emotive or sonic features, has become an important aspect of Music Information Retrieval system development, yet little of this research is user-oriented. We summarize the results of an extensive user study that explores who the users of music information retrieval systems are and which structural descriptions of music best characterize their understanding of musical expression. Our study reveals that the perceived qualities of music are affected by the user's context. Subject dependencies are found for age, musical expertise, musicianship, taste, and familiarity with the music. Furthermore, interesting relationships are discovered between expressive and structural features. These findings are validated by means of a Semantic Music Recommender System prototype. The demonstration system recommends music from a database containing the quality ratings provided by the participants in a music annotation experiment. A test in the real world reveale...