This article presents a method for quantifying the visual similarity between two images. Such problems arise in many applications, including object recognition and image classification. In this paper, we propose to use self-organizing feature maps (SOM) to measure image similarity. To reach this goal, we feed local signatures associated with salient patches into the neural network. At the end of the learning step, each neural unit is tuned to a particular local signature prototype. During the recognition step, each image presented to the network generates a neural map that can be represented by an activity histogram. Image similarity is then computed as a quadratic distance between histograms. This scheme offers very promising results for image classification, achieving a correct classification rate of 84.47%.
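To make the pipeline concrete, the following is a minimal sketch of the described scheme: a small SOM is trained on local signatures, each image is summarized by an activity histogram over the map's units, and two histograms are compared with a quadratic-form distance. The SOM implementation, grid size, descriptor dimension, neighborhood schedule, and random patch data are all illustrative assumptions, not the authors' actual system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for salient-patch extraction: each image
# yields a set of local signatures, here simply random vectors.
def extract_signatures(n_patches=50, dim=8):
    return rng.normal(size=(n_patches, dim))

# Minimal self-organizing map: a grid of units, each holding a
# weight vector, trained with a shrinking Gaussian neighborhood.
class SOM:
    def __init__(self, rows, cols, dim):
        self.weights = rng.normal(size=(rows * cols, dim))
        r, c = np.divmod(np.arange(rows * cols), cols)
        self.grid = np.stack([r, c], axis=1).astype(float)

    def bmu(self, x):
        # Best-matching unit: index of the closest weight vector.
        return int(np.argmin(np.linalg.norm(self.weights - x, axis=1)))

    def train(self, data, epochs=20, lr0=0.5, sigma0=2.0):
        for t in range(epochs):
            lr = lr0 * (1 - t / epochs)
            sigma = sigma0 * (1 - t / epochs) + 0.5
            for x in data[rng.permutation(len(data))]:
                b = self.bmu(x)
                d2 = np.sum((self.grid - self.grid[b]) ** 2, axis=1)
                h = np.exp(-d2 / (2 * sigma ** 2))[:, None]
                self.weights += lr * h * (x - self.weights)

# Activity histogram: for one image, count how often each unit
# is the best-matching unit over that image's local signatures.
def activity_histogram(som, signatures):
    hist = np.zeros(len(som.weights))
    for x in signatures:
        hist[som.bmu(x)] += 1
    return hist / hist.sum()

# Quadratic-form distance d = sqrt((h1-h2)^T A (h1-h2)), where A
# encodes similarity between units, here from grid distances.
def quadratic_distance(som, h1, h2):
    d = np.linalg.norm(som.grid[:, None] - som.grid[None, :], axis=2)
    A = 1 - d / d.max()
    diff = h1 - h2
    return float(np.sqrt(max(diff @ A @ diff, 0.0)))

som = SOM(4, 4, 8)
train_sigs = np.vstack([extract_signatures() for _ in range(10)])
som.train(train_sigs)

h1 = activity_histogram(som, extract_signatures())
h2 = activity_histogram(som, extract_signatures())
print(quadratic_distance(som, h1, h2))
```

The quadratic-form distance is chosen over a simple bin-by-bin comparison because it accounts for cross-bin similarity: activity in neighboring SOM units, which represent similar signature prototypes, is treated as partially matching.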