The recursive optimal per-pixel estimate (ROPE) is an effective end-to-end distortion estimation scheme. Most existing ROPE-based applications assume that (i) the encoder knows the actual packet loss rate exactly, (ii) the encoder knows the error concealment scheme used at the decoder, and (iii) no in-loop deblocking filter is employed. In practice, however, these assumptions may not all hold. In this paper, we investigate the impact of mismatch between the assumed and actual conditions on the accuracy of ROPE and on the corresponding rate-distortion optimized coding mode selection. Useful conclusions are drawn from extensive experimental results.
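
As background, the following is a sketch of the standard ROPE recursions, not taken verbatim from this paper, assuming a packet loss rate $p$ and the simplest temporal concealment (copying the co-located pixel of the previous frame). Let $f_n^i$ denote the original value of pixel $i$ in frame $n$, $\hat{f}_n^i$ the encoder reconstruction, and $\tilde{f}_n^i$ the (random) decoder reconstruction. The expected end-to-end distortion per pixel is
\[
d_n^i = E\{(f_n^i - \tilde{f}_n^i)^2\} = (f_n^i)^2 - 2 f_n^i\, E\{\tilde{f}_n^i\} + E\{(\tilde{f}_n^i)^2\},
\]
where the two moments are updated recursively. For an intra-coded pixel,
\[
E\{\tilde{f}_n^i\} = (1-p)\,\hat{f}_n^i + p\,E\{\tilde{f}_{n-1}^i\}, \qquad
E\{(\tilde{f}_n^i)^2\} = (1-p)\,(\hat{f}_n^i)^2 + p\,E\{(\tilde{f}_{n-1}^i)^2\},
\]
and for an inter-coded pixel predicted from pixel $j$ of frame $n-1$ with quantized residual $\hat{e}_n^i$,
\[
E\{\tilde{f}_n^i\} = (1-p)\bigl(\hat{e}_n^i + E\{\tilde{f}_{n-1}^j\}\bigr) + p\,E\{\tilde{f}_{n-1}^i\},
\]
\[
E\{(\tilde{f}_n^i)^2\} = (1-p)\bigl((\hat{e}_n^i)^2 + 2\hat{e}_n^i\,E\{\tilde{f}_{n-1}^j\} + E\{(\tilde{f}_{n-1}^j)^2\}\bigr) + p\,E\{(\tilde{f}_{n-1}^i)^2\}.
\]
Because the assumed loss rate $p$ and the assumed concealment rule enter every update, a mismatch in either quantity propagates through the recursion and biases the distortion estimates used for mode selection.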