This paper proposes an intuitive configuration tool for autonomic pervasive computing systems. Specifically, the paper presents a system that infers user task intentions from a variety of sensed information and describes how users can readily configure it to support particular activities in particular spaces. The approach combines the capture of user task concepts and policies with their integration into a 3D virtual reality model of the environment in use, so that these concepts and policies can be mapped intuitively yet accurately onto the sensed physical environment.