The problem addressed here concerns the minimization of the total loss probability in a series of finite queues at which customers are rejected whenever the waiting capacity is exceeded. More precisely, the question is whether there exist conditions under which an increase in the loss rate at one queue, e.g. the most upstream one, can result in a decrease in the total loss rate over the whole network. Within a simple model, the answer is shown to be negative.
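As an illustration of the setting, the following sketch simulates a hypothetical two-station tandem of M/M/1/K queues in which customers are lost on arrival at a full buffer, and estimates the total loss probability. The arrival rate `lam`, service rates `mu1`, `mu2`, and buffer sizes `K1`, `K2` are illustrative assumptions, not parameters of the model analyzed here; the sketch merely shows how the question could be probed numerically.

```python
import heapq
import random

def simulate_tandem_loss(lam, mu1, mu2, K1, K2, n_customers=200_000, seed=0):
    """Estimate the total loss probability of a two-station tandem of
    M/M/1/K loss queues: an arrival finding buffer 1 full is rejected,
    and a customer finishing service at station 1 that finds buffer 2
    full is also rejected (no blocking).  Illustrative sketch only."""
    rng = random.Random(seed)
    q = [0, 0]          # number of customers present at each station
    lost = 0            # total rejections, at either station
    arrivals = 0
    events = [(rng.expovariate(lam), 'arr')]   # (time, event type) heap
    while arrivals < n_customers:
        t, kind = heapq.heappop(events)
        if kind == 'arr':
            arrivals += 1
            heapq.heappush(events, (t + rng.expovariate(lam), 'arr'))
            if q[0] < K1:
                q[0] += 1
                if q[0] == 1:   # server 1 was idle: start a service
                    heapq.heappush(events, (t + rng.expovariate(mu1), 'dep1'))
            else:
                lost += 1       # rejected at the upstream queue
        elif kind == 'dep1':
            q[0] -= 1
            if q[0] > 0:        # next customer enters service at station 1
                heapq.heappush(events, (t + rng.expovariate(mu1), 'dep1'))
            if q[1] < K2:
                q[1] += 1
                if q[1] == 1:   # server 2 was idle: start a service
                    heapq.heappush(events, (t + rng.expovariate(mu2), 'dep2'))
            else:
                lost += 1       # rejected at the downstream queue
        else:                   # 'dep2': departure from the network
            q[1] -= 1
            if q[1] > 0:
                heapq.heappush(events, (t + rng.expovariate(mu2), 'dep2'))
    return lost / arrivals

# One can compare the total loss probability under different upstream
# buffer sizes; shrinking K1 raises the upstream loss rate, and the
# simulation lets one check how the network-wide loss rate responds.
print(simulate_tandem_loss(lam=1.0, mu1=1.2, mu2=1.2, K1=5, K2=5))
print(simulate_tandem_loss(lam=1.0, mu1=1.2, mu2=1.2, K1=2, K2=5))
```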