We study the effect of adding redundancy to an input stream on the losses that occur due to buffer overflow. We consider several sessions that generate traffic into a finite-capacity queue. Using multi-dimensional probability generating functions, we derive analytical formulas for the loss probabilities and provide asymptotic analysis (for large n and for small or large ρ). Our analysis allows us to investigate when adding redundancy decreases the loss probabilities. In many cases, redundancy is shown to degrade performance, as the gain from adding redundancy is not sufficient to compensate for the additional losses due to the increased overhead. We show, however, that it is possible to decrease the loss probabilities if a sufficiently large amount of redundancy is added. Indeed, we show that for an arbitrary stationary ergodic input process, if ρ < 1 then redundancy can reduce the loss probabilities to an arbitrarily small value. (C) 1999 Elsevier Science B.V. All rights reserved.
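
To make the trade-off concrete, here is a minimal numerical sketch, not the paper's generating-function analysis: it approximates the per-packet drop probability by the M/M/1/K blocking formula evaluated at the load inflated by the redundancy overhead, ρ(k + r)/k, and treats drops within a block as independent, so that a block of k data packets is lost when more than r of its k + r transmitted packets are dropped. All names and parameter values (k, r, K, rho) are illustrative assumptions, not quantities from the paper.

    from math import comb

    def mm1k_blocking(rho, K):
        """Blocking probability of an M/M/1/K queue at offered load rho."""
        if rho == 1.0:
            return 1.0 / (K + 1)
        return (1.0 - rho) * rho**K / (1.0 - rho**(K + 1))

    def block_loss(k, r, rho, K):
        """P(block lost): more than r of the k + r packets are dropped,
        with the per-packet drop probability taken at the inflated load."""
        p = mm1k_blocking(rho * (k + r) / k, K)
        n = k + r
        return sum(comb(n, i) * p**i * (1 - p)**(n - i)
                   for i in range(r + 1, n + 1))

    if __name__ == "__main__":
        k, K, rho = 10, 20, 0.8  # hypothetical block size, buffer size, base load
        for r in range(11):
            print(f"r = {r:2d}  block-loss ≈ {block_loss(k, r, rho, K):.3e}")

Even in this crude approximation the two competing effects are visible directly: the binomial tail shrinks as r grows, while the blocking probability grows with the inflated load ρ(k + r)/k, and whichever effect dominates determines whether redundancy helps; in particular, once the inflated load exceeds 1 the losses climb sharply.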