The hippocampus is needed to store memories that are reconfigurable. Therefore, a hippocampal-like computational model should be able to solve transitive inference (TI) problems. By turning TI into a sequence learning problem (stimuli-decisions-outcome), a sequence-learning, hippocampal-like neural network solves it. In the transitive inference problem studied here, a network simulation begins by learning six pairwise relationships: A > B, B > C, C > D, D > E, E > F, and F > G, where the underlying relationship is the linear string A > B > C > D > E > F > G. The simulation is then tested with the novel pairs: B?D, C?E, D?F, B?E, C?F, B?F, and A?G.
The symbolic distance effect, found in animal and human experiments, is reproduced by the network simulations. That is, the simulations give stronger decodings for B > F than for B > E or C > F, and decodings for B > E and C > F are stronger than for B > D, C > E, or D > F. © 2001 Elsevier Science B.V. All rights reserved.
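
The sketch below is a minimal, hypothetical rendering of the task just described: the six premise pairs and seven novel test pairs from the abstract, laid out as stimulus-decision-outcome trials. The simple reward-driven value scorer is an illustrative stand-in, not the hippocampal-like sequence learning network itself; its learning rate, epoch count, and scoring rule are assumptions made only for this sketch.

```python
# Minimal sketch of the transitive inference task described above.
# The premise and test pairs follow the abstract; the value-update scorer
# is an illustrative stand-in, NOT the hippocampal-like sequence learning
# network. Learning rate and epoch count are assumptions.

ITEMS = list("ABCDEFG")                              # underlying order A > B > ... > G
PREMISES = [("A", "B"), ("B", "C"), ("C", "D"),
            ("D", "E"), ("E", "F"), ("F", "G")]      # six trained pairwise relationships
TEST_PAIRS = [("B", "D"), ("C", "E"), ("D", "F"),    # symbolic distance 2
              ("B", "E"), ("C", "F"),                # symbolic distance 3
              ("B", "F"),                            # symbolic distance 4
              ("A", "G")]                            # end-anchored novel pair

def trial(winner, loser, choose_winner=True):
    """Lay one trial out as a stimulus-decision-outcome sequence."""
    stimulus = (winner, loser)
    decision = winner if choose_winner else loser
    outcome = "reward" if decision == winner else "no_reward"
    return stimulus, decision, outcome

def train_values(premises, epochs=300, lr=0.05):
    """Toy reward-driven scorer: push each trained pair's values apart."""
    value = {item: 0.0 for item in ITEMS}
    for _ in range(epochs):
        for winner, loser in premises:
            gap = value[winner] - value[loser]
            value[winner] += lr * (1.0 - gap)        # rewarded choice strengthened
            value[loser] -= lr * (1.0 - gap)         # unrewarded alternative weakened
    return value

if __name__ == "__main__":
    print(trial("A", "B"))                           # (('A', 'B'), 'A', 'reward')
    values = train_values(PREMISES)
    for a, b in TEST_PAIRS:
        distance = abs(ITEMS.index(a) - ITEMS.index(b))
        score = values[a] - values[b]                # larger score = stronger "a > b" decoding
        print(f"{a}?{b}: distance {distance}, score {score:+.2f}")
```

Even this crude stand-in spaces the items along the underlying string, so its decoding margin grows with symbolic distance, which is the qualitative pattern the abstract reports for the network simulations.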