Image matching is a fundamental task in many computer vision applications. A popular approach represents the two images to be matched as two bags of local descriptors and then applies the classic RANSAC-based matching procedure. In this paper, we present a more efficient image matching approach that works with sets of arbitrary local descriptors. A block-to-block strategy is devised to speed up the establishment of local correspondences. Additionally, a weighted RANSAC (w-RANSAC) technique is proposed to make the search for an optimal global model converge faster. Comparative experiments against the RANSAC-based paradigm show that our approach not only generates more accurate correspondences but also doubles the matching speed.
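The abstract does not detail w-RANSAC, but a common way to weight RANSAC is to draw minimal samples with probability proportional to each correspondence's confidence (e.g. a descriptor-distance score), so reliable matches seed model hypotheses more often and a good model tends to be found in fewer iterations. The sketch below illustrates that idea for a 2D affine model; the function name, the affine choice, and the weighting scheme are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def weighted_ransac(src, dst, weights, iters=200, thresh=3.0, rng=None):
    """Hypothetical weighted-sampling RANSAC for a 2D affine model.

    src, dst : (n, 2) arrays of corresponding points
    weights  : per-match confidence scores (assumed, e.g. from
               descriptor distances); higher = sampled more often
    """
    rng = np.random.default_rng(rng)
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()                                  # sampling distribution
    n = len(src)
    best_model, best_inliers = None, np.zeros(n, dtype=bool)
    for _ in range(iters):
        # Draw a minimal sample (3 matches) biased toward confident matches.
        idx = rng.choice(n, size=3, replace=False, p=p)
        A = np.hstack([src[idx], np.ones((3, 1))])   # 3x3 design matrix
        try:
            M = np.linalg.solve(A, dst[idx])         # 3x2 affine parameters
        except np.linalg.LinAlgError:
            continue                                 # degenerate sample
        pred = np.hstack([src, np.ones((n, 1))]) @ M
        inliers = np.linalg.norm(pred - dst, axis=1) < thresh
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = M, inliers
    return best_model, best_inliers
```

Plain RANSAC corresponds to uniform weights; concentrating mass on good matches raises the chance that any one iteration draws an all-inlier sample, which is the source of the faster convergence claimed for w-RANSAC.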