In this paper we explore the use of a new rate-distortion metric for optimizing real-time Internet video streaming over the Transmission Control Protocol (TCP). The basic idea is to combat packet delays caused by TCP retransmissions, which are effectively interpreted as errors by the streaming application. To this end, we develop an analytical model of the expected video distortion at the decoder as a function of the expected latency of TCP packets, the channel state, and the error concealment method used at the receiver. This metric is exploited in the design of a new algorithm for rate-distortion optimized encoding mode selection for video streaming with TCP (RDOMS-TCP). Real-time video streaming experiments show a considerable PSNR improvement, on the order of 2 dB, over currently proposed TCP-based streaming mechanisms.
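The mode-selection rule described above can be read as a Lagrangian minimization: for each coding unit, pick the mode minimizing J = E[D] + λ·R, where the expected distortion E[D] weights the decoded distortion by the probability that the packet arrives before its playout deadline and the concealment distortion by the probability that it does not. The Python sketch below illustrates this structure under simplifying assumptions; the names (`p_late`, `d_concealed`, `lam`, `select_mode`) and the example numbers are ours, not the paper's, and in the actual scheme `p_late` would be derived from the analytical TCP latency and channel-state model rather than supplied as a constant.

```python
def expected_distortion(d_decoded, d_concealed, p_late):
    """Expected distortion at the decoder: the packet either arrives in
    time (decoded distortion) or misses its playout deadline and is
    concealed (concealment distortion)."""
    return (1.0 - p_late) * d_decoded + p_late * d_concealed


def select_mode(modes, p_late, d_concealed, lam):
    """Rate-distortion optimized mode selection: minimize the Lagrangian
    cost J = E[D] + lam * R over candidate modes.

    `modes` is a list of (rate_in_bits, decoded_distortion) pairs, one per
    candidate encoding mode (e.g., intra, inter, skip)."""
    best_mode, best_cost = None, float("inf")
    for mode, (rate, d_dec) in enumerate(modes):
        cost = expected_distortion(d_dec, d_concealed, p_late) + lam * rate
        if cost < best_cost:
            best_mode, best_cost = mode, cost
    return best_mode


# Illustrative usage with made-up numbers: (bits, MSE) per mode.
modes = [(2400, 12.0), (900, 20.0), (100, 45.0)]
best = select_mode(modes, p_late=0.15, d_concealed=80.0, lam=0.01)
print(best)  # -> 1: the mid-rate mode wins at this deadline-miss probability
```

Note how a higher deadline-miss probability `p_late` shrinks the payoff of spending bits on low decoded distortion, since the concealment term dominates the expected cost; this is the mechanism by which the metric steers the encoder away from modes whose quality would be wasted on late packets.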