This paper introduces a novel approach to image inpainting based on nonlocal means. Traditional inpainting techniques use only local information around the target region to fill in the missing data, which is insufficient in many cases. More recent inpainting techniques based on exemplar-based synthesis exploit nonlocal information, but only in a very limited way. The proposed algorithm uses nonlocal image information from multiple samples within the image: the contribution of each sample to the reconstruction of a target pixel is determined by a weighted similarity function, and the contributions are aggregated to form the missing information. Experimental results show that the proposed method yields quantitative and qualitative improvements over the current exemplar-based approach. The proposed approach can also be integrated into existing exemplar-based inpainting techniques to improve visual quality.
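The weighted-aggregation idea described above can be sketched as follows. This is a minimal illustration of nonlocal-means-style inpainting for a single missing pixel, not the paper's actual algorithm; the patch size, the Gaussian form of the similarity weight, and the parameter `h` are assumptions for the sake of the example:

```python
import numpy as np

def nlm_inpaint_pixel(img, mask, y, x, patch=3, h=10.0):
    """Estimate one missing pixel (mask == 0 at (y, x)) as a
    similarity-weighted average of pixels elsewhere in the image
    whose surrounding patches resemble the target's patch.

    Illustrative sketch only; names and parameters are assumptions,
    not the paper's notation.
    """
    r = patch // 2
    H, W = img.shape
    tp = img[y - r:y + r + 1, x - r:x + r + 1]   # target patch
    tm = mask[y - r:y + r + 1, x - r:x + r + 1]  # known pixels in it
    num, den = 0.0, 0.0
    for j in range(r, H - r):
        for i in range(r, W - r):
            # skip unknown centers and the target pixel itself
            if mask[j, i] == 0 or (j == y and i == x):
                continue
            sp = img[j - r:j + r + 1, i - r:i + r + 1]
            # compare only over pixels known in both patches
            valid = tm * mask[j - r:j + r + 1, i - r:i + r + 1]
            if valid.sum() == 0:
                continue
            d2 = ((tp - sp) ** 2 * valid).sum() / valid.sum()
            w = np.exp(-d2 / h ** 2)  # assumed Gaussian similarity weight
            num += w * img[j, i]      # aggregate weighted sample values
            den += w
    return num / den if den > 0 else img[y, x]
```

For example, on a horizontal-gradient image with a single hole, the estimate recovers the gradient value at the hole, since candidate patches from other rows of the same column match the target patch exactly and dominate the weighted average.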