In this paper, we present a model of the TCP/IP flow control mechanism. The rate at which data is transmitted increases linearly in time until a packet loss is detected. At that point, the transmission rate is divided by a constant factor. Losses are generated by an exogenous random process which is only assumed to be stationary. This allows us to account for any correlation and any distribution of inter-loss times. We obtain an explicit expression for the throughput of a TCP connection and bounds on the throughput when there is a limit on the congestion window size. In addition, we study the effect of the TimeOut mechanism on the throughput. A set of experiments is conducted over the real Internet, and a comparison is provided with other models which make simpler assumptions about the inter-loss time process.
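
To make the additive-increase/multiplicative-decrease dynamics described above concrete, the following is a minimal simulation sketch. The linear slope `alpha`, the reduction factor `beta`, and the choice of i.i.d. exponential inter-loss times are illustrative assumptions only; the exponential case is just one simple instance of the general stationary loss process considered in the paper.

```python
import random

def simulate_aimd_throughput(alpha=1.0, beta=0.5, mean_interloss=1.0,
                             num_losses=100_000, seed=0):
    """Fluid AIMD sketch: the transmission rate grows linearly at slope
    `alpha` between losses and is multiplied by `beta` at each loss epoch.
    Inter-loss times are drawn i.i.d. exponential here, a simple special
    case of a stationary loss process (an assumption for illustration)."""
    rng = random.Random(seed)
    rate = 0.0          # current transmission rate
    total_data = 0.0    # data sent so far
    total_time = 0.0    # elapsed time
    for _ in range(num_losses):
        tau = rng.expovariate(1.0 / mean_interloss)   # time until next loss
        # Data sent over the interval: integral of (rate + alpha * t) dt
        total_data += rate * tau + 0.5 * alpha * tau * tau
        rate += alpha * tau   # linear increase up to the loss instant
        rate *= beta          # multiplicative decrease at the loss
        total_time += tau
    return total_data / total_time   # long-run throughput estimate

if __name__ == "__main__":
    print(simulate_aimd_throughput())
```

Such a simulation can serve as a numerical check of a closed-form throughput expression under a specific loss process; it does not model the congestion-window limit or the TimeOut mechanism studied later in the paper.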