It is well established that different kinds of visual attributes are processed separately within the brain. This separation is related to differences in the information that is relevant for the different attributes. When attributes differ greatly (such as colour and motion), it is obvious that they must rely on different information. However, separating the processing of different attributes could also allow highly related attributes to evolve independently, so that they end up being judged on the basis of different types of information. Here, we examine the case of egocentric and relative localisation. For judging egocentric positions, the orientation of the eyes has to be taken into account. This is not so for judging relative positions. We demonstrate that these two attributes can be processed separately by showing that simultaneous judgements of relative and egocentric position differ in their dependency on eye orientation. Subjects pursued a moving dot. We flashed either single targets, or pairs of targets with a 67 ms interval between them, directly below the subjects' gaze. As the eyes were moving during the 67 ms interval, the retinal separation between pairs of targets differed from their actual separation. Subjects indicated the position at which they saw the targets with reasonable reproducibility, with a consistent bias in the direction of the eye movement. However, when two targets were flashed, the indicated separation between them usually coincided with their retinal separation, rather than with their actual separation. We conclude that egocentric and relative spatial positions can be estimated separately and simultaneously, on the basis of different types of information. © 2000 Elsevier Science Ltd. All rights reserved.
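The mismatch between retinal and actual separation follows from simple geometry: during the 67 ms between flashes the pursuing eye moves, so the second flash lands on a shifted retinal location. A minimal sketch of this arithmetic, assuming a hypothetical pursuit speed (the abstract does not state the eye velocity used):

```python
# Sketch of the retinal-shift geometry described in the abstract.
# The pursuit speed below is an assumed illustrative value, not from the paper.
pursuit_speed_deg_per_s = 15.0   # assumed smooth-pursuit eye velocity
interval_s = 0.067               # 67 ms between the two flashes (from the abstract)

# Suppose both targets are flashed at the same screen position, so their
# actual separation is zero.
actual_separation_deg = 0.0

# During the interval the eye moves this far:
eye_shift_deg = pursuit_speed_deg_per_s * interval_s

# The second flash therefore lands shifted on the retina, opposite to the
# direction of the eye movement:
retinal_separation_deg = actual_separation_deg - eye_shift_deg
print(round(retinal_separation_deg, 3))  # prints -1.005
```

Under these assumed numbers, two physically coincident flashes are separated by about one degree on the retina, which is the separation subjects reportedly matched in their relative-position judgements.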