The purpose of this paper is to study how resequencing packets in a communication network may affect the performance of the application needing the information. We analyze a simple queueing model in which customers may be randomly disordered before arriving at a single-server queue. The performance index chosen is the variance of the server waiting time, which is a measure of the 'jitter' suffered by the application. The analysis reveals that not resequencing customers improves service regularity for a wide range of loss probabilities and service times. It also shows that, contrary to the conclusions of previous analyses, resequencing does not necessarily worsen the performance of the system.