Performance evaluation of information retrieval systems is a decisive aspect of measuring improvements in search technology. Our work aims to provide a framework for comparing and contrasting the performance of search engines in a research environment. To this end, we have designed and developed USim, a tool for the performance evaluation of Web IR systems based on the simulation of user behavior. This simulation tool contributes to the performance evaluation process in two ways: by estimating the saturation threshold of the system, and by enabling the comparison of different search algorithms or engines. The latter is the more interesting because, as we demonstrate, comparisons under different workload environments yield more accurate results, avoiding the erroneous conclusions that can be drawn from ideal environments. From a general point of view, USim is intended as an approximation to new performance evaluation techniques developed specifically for the Internet sea...