The approach proposed in this paper retrieves contours from transrectal ultrasound (TRUS) prostate images. The input images are sparsely annotated by radiologists for brachytherapy planning and post-interventional monitoring. The theoretical contribution of the paper lies in the design of a task-oriented, bottom-up method that mimics perceptual grouping mechanisms for contour retrieval. The approach is task-oriented in that it embeds prior anatomical and procedural knowledge. From a practical standpoint, it is of clinical relevance, since it retrieves contours from images in which the annotation is `blended' with the image content. While modern image annotation systems can store image content and annotations separately, many TRUS prostate databases still contain only `blended' annotations. Our approach enables contour retrieval, and hence 3D prostate modeling, from such databases.
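To make the `blended'-annotation setting concrete, the following is a minimal, hypothetical sketch (not the paper's method): annotation marks are assumed to appear as near-saturated pixels inside the grayscale ultrasound image, and the sparse marks are grouped into an ordered contour by sorting them angularly around their centroid, which is valid only for roughly star-shaped boundaries such as the prostate. The function name, marker threshold, and synthetic image are all illustrative assumptions.

```python
import numpy as np

def retrieve_contour(image, marker_value=255):
    """Recover an ordered contour from annotation marks `blended' into a
    grayscale image. Illustrative sketch only: the paper's method uses
    task-oriented perceptual grouping; here we merely threshold the
    bright marker pixels (assumed intensity >= marker_value) and order
    them by angle around their centroid."""
    ys, xs = np.nonzero(image >= marker_value)   # candidate marker pixels
    pts = np.column_stack((xs, ys)).astype(float)
    centroid = pts.mean(axis=0)
    angles = np.arctan2(pts[:, 1] - centroid[1], pts[:, 0] - centroid[0])
    return pts[np.argsort(angles)]               # ordered contour points

# Toy usage: a 64x64 synthetic "ultrasound" image with 8 annotation
# dots placed on a circle of radius 20 around the image center.
img = np.zeros((64, 64), dtype=np.uint8)
for a in np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False):
    img[int(32 + 20 * np.sin(a)), int(32 + 20 * np.cos(a))] = 255
contour = retrieve_contour(img)
print(contour.shape)  # (8, 2)
```

A real pipeline would additionally have to separate marker pixels from bright tissue echoes and interpolate between the sparse annotation points, which is where the prior anatomical knowledge described above comes into play.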