Music is an important part of our lives. People enjoy listening to music, and many of us find special pleasure in creating it. Computers have further extended many aspects of our musical experience: listening to, recording, and creating music is now easier and more accessible to a wide range of users. Conversely, various computing applications exploit music to better support interaction with users. However, listening to music is generally a passive experience. Although we may adjust many parameters, the music we listen to generally does not reflect our response, or does so only roughly. In this paper we present a flexible framework that enables the active creation of instrumental music based on the implicit dynamics and content of human-computer interaction. Our approach is application independent, and it provides a mapping of musical parameters to an abstraction of user interaction. This mapping is based on an analysis of the dynamics and content of the human-computer interaction.
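To make the idea concrete, the sketch below illustrates the kind of mapping described here: a few features abstracted from interaction dynamics and content are mapped onto musical parameters. All names in this sketch (`InteractionFeatures`, `map_to_music`, the particular features, and the parameter ranges) are illustrative assumptions, not the framework's actual interface.

```python
# A minimal sketch, assuming hypothetical feature and parameter names;
# this is not the paper's actual API, only an illustration of mapping
# interaction dynamics/content onto musical parameters.
from dataclasses import dataclass

@dataclass
class InteractionFeatures:
    event_rate: float   # user input events per second (dynamics)
    intensity: float    # normalized 0..1 measure of interaction intensity
    valence: float      # -1..1 content-derived valence

@dataclass
class MusicParameters:
    tempo_bpm: int      # faster interaction -> faster tempo
    volume: float       # 0..1, driven by interaction intensity
    mode: str           # "major" or "minor", driven by content valence

def map_to_music(f: InteractionFeatures) -> MusicParameters:
    # Clamp the event rate to 0..10 events/s and scale it into a
    # musically useful tempo range of 60..180 BPM.
    tempo = int(60 + min(max(f.event_rate, 0.0), 10.0) / 10.0 * 120)
    volume = min(max(f.intensity, 0.0), 1.0)
    mode = "major" if f.valence >= 0 else "minor"
    return MusicParameters(tempo_bpm=tempo, volume=volume, mode=mode)

if __name__ == "__main__":
    # A calm, positive interaction episode maps to slow, quiet, major-mode music.
    print(map_to_music(InteractionFeatures(event_rate=1.5, intensity=0.3, valence=0.6)))
```

Because the mapping consumes only an abstraction of the interaction rather than application-specific events, the same scheme can be reused across applications, which is the sense in which the approach is application independent.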