In this paper, we present a model for the TCP/IP flow control mechanism. The rate at which data is transmitted increases linearly in time until a packet loss is detected. At that point, the transmission rate is divided by a constant factor. Losses are generated by an exogenous random process which is only assumed to be stationary. This allows us to account for any correlation and any distribution of inter-loss times. We obtain an explicit expression for the throughput of a TCP connection, and bounds on the throughput when there is a limit on the congestion window size. In addition, we study the effect of the TimeOut mechanism on the throughput. A set of experiments is conducted over the real Internet, and a comparison is provided with other models which make simpler assumptions on the inter-loss time process.
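As a simple illustration of the kind of expression involved (the deterministic special case, not the paper's general stationary-loss result), suppose losses occur exactly every $T$ seconds, the rate grows linearly at slope $\alpha$, and each loss divides the rate by a factor $\gamma > 1$. In steady state the peak rate $X^*$ satisfies $X^*/\gamma + \alpha T = X^*$, which gives the peak rate and the time-average throughput over a cycle:
\[
X^* = \frac{\gamma \alpha T}{\gamma - 1}, \qquad
\bar{X} = \frac{1}{2}\left(\frac{X^*}{\gamma} + X^*\right) = \frac{(\gamma + 1)\,\alpha T}{2(\gamma - 1)}.
\]
For the standard TCP halving ($\gamma = 2$) this yields $\bar{X} = \tfrac{3}{2}\alpha T$: throughput scales linearly with the mean inter-loss time.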
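The same dynamic is also easy to simulate directly. The following minimal Python sketch (our illustration, not the paper's code; the function and parameter names are hypothetical) estimates the long-run throughput for an arbitrary stationary inter-loss process supplied by the caller, which is exactly the generality the model allows.

import random

def aimd_throughput(inter_loss, alpha=1.0, gamma=2.0, n_losses=100_000, seed=0):
    """Estimate long-run AIMD throughput: the rate grows linearly at
    slope `alpha` between losses and is divided by `gamma` at each loss;
    `inter_loss(rng)` returns one inter-loss time (any stationary process)."""
    rng = random.Random(seed)
    x = 0.0      # current transmission rate
    data = 0.0   # total data sent (integral of the rate over time)
    time = 0.0   # total elapsed time
    for _ in range(n_losses):
        t = inter_loss(rng)
        data += x * t + 0.5 * alpha * t * t  # area under the linear ramp
        time += t
        x = (x + alpha * t) / gamma          # multiplicative decrease at the loss
    return data / time

# Deterministic losses every T = 2 s: the formula above predicts (3/2)*alpha*T = 3.0.
print(aimd_throughput(lambda rng: 2.0))
# i.i.d. exponential inter-loss times with the same mean, for comparison.
print(aimd_throughput(lambda rng: rng.expovariate(0.5)))

Swapping the sampler for a correlated process (e.g., a Markov-modulated one) leaves the estimator unchanged, mirroring the model's sole stationarity assumption.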