In this paper we present an experimental study of three existing rate adaptation algorithms conducted in 802.11-based mesh networks. The aim of this study is twofold. On the one hand, we explore the ability of these algorithms to cope with moderate to high medium contention levels. On the other hand, we investigate their performance on medium-distance 802.11 links. Our study indicates that, in congested networks, the network throughput can degrade by up to a factor of ten with respect to the best achievable performance if the rate decision process is based solely on frame loss rates, without differentiating between the various causes of losses (i.e., channel errors or collisions). In addition, we show that these rate adaptation strategies perform reasonably well when the time correlation between channel errors is at least on the order of the sampling period used to estimate the channel dynamics. We believe that this study can be useful in deriving sound guidelines for the design of new, optimized rate adaptation algorithms.
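To make the loss-differentiation issue concrete, the following is a minimal, hypothetical sketch (not one of the algorithms evaluated in this paper) of a rate controller that only steps the PHY rate down when losses are attributed to channel errors, leaving the rate unchanged when collisions dominate. The class names, thresholds, sampling window, and the assumption that some external mechanism has already classified each loss are all illustrative.

```python
"""Toy sketch: rate adaptation that differentiates loss causes.

Assumes an external mechanism (e.g., probing) has already classified each
lost frame as a collision loss or a channel-error loss; all names and
thresholds below are hypothetical.
"""

from dataclasses import dataclass

RATES_MBPS = [6, 9, 12, 18, 24, 36, 48, 54]  # 802.11a/g PHY rates


@dataclass
class WindowStats:
    """Per-sampling-window transmission statistics (hypothetical input)."""
    sent: int             # frames transmitted in the window
    lost_collision: int   # losses attributed to collisions
    lost_channel: int     # losses attributed to channel errors


class LossAwareRateController:
    """Steps the rate down only on channel-error losses, not on collisions."""

    def __init__(self, down_threshold: float = 0.3, up_threshold: float = 0.1):
        self.rate_index = len(RATES_MBPS) - 1   # start at the highest rate
        self.down_threshold = down_threshold    # channel-error ratio to step down
        self.up_threshold = up_threshold        # total loss ratio to probe upward

    def update(self, stats: WindowStats) -> float:
        if stats.sent == 0:
            return RATES_MBPS[self.rate_index]

        channel_loss_ratio = stats.lost_channel / stats.sent
        total_loss_ratio = (stats.lost_channel + stats.lost_collision) / stats.sent

        if channel_loss_ratio > self.down_threshold and self.rate_index > 0:
            # Channel errors dominate: a lower, more robust rate may help.
            self.rate_index -= 1
        elif total_loss_ratio < self.up_threshold and self.rate_index < len(RATES_MBPS) - 1:
            # Losses are rare: probe the next higher rate.
            self.rate_index += 1
        # Collision-dominated losses leave the rate unchanged; lowering the
        # rate would only lengthen transmissions and worsen congestion.

        return RATES_MBPS[self.rate_index]


# Example: a congested window where most losses are collisions.
ctrl = LossAwareRateController()
print(ctrl.update(WindowStats(sent=100, lost_collision=40, lost_channel=5)))  # stays at 54
```

A controller that ignored the loss cause would interpret the 45% total loss ratio above as a poor channel and drop to a lower rate, lengthening each transmission and aggravating the very congestion that caused the losses; this is the failure mode behind the throughput degradation reported in the study.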