In this work a vision-based autonomous system capable of memorizing and recalling sensory-motor associations is presented. The robot's behaviors are based on learned associations between its sensory inputs and its motor actions. Perception is divided into two stages. The first is functional: algorithmic procedures extract visual features, such as disparity and local orientation, from the input images in real time. The second is mnemonic: the features produced by the different functional areas are integrated with motor information and memorized or recalled. An efficient memory organization and fast information retrieval enable the robot to learn to navigate and to avoid obstacles without the need for an internal metric reconstruction of the external environment.
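The mnemonic stage described above can be sketched as an associative memory that stores feature vectors paired with motor actions and, at recall time, returns the action of the closest stored feature. This is a minimal illustrative sketch under assumed details: the class name, the nearest-neighbor recall rule, and the example actions are hypothetical, not the paper's actual memory organization.

```python
import math

class SensoryMotorMemory:
    """Hypothetical associative store of (sensory features, motor action) pairs."""

    def __init__(self):
        self.entries = []  # list of (feature_vector, motor_action) pairs

    def memorize(self, features, action):
        # Store one sensory-motor association.
        self.entries.append((list(features), action))

    def recall(self, features):
        # Return the action paired with the nearest stored feature vector.
        if not self.entries:
            return None
        nearest = min(self.entries, key=lambda e: math.dist(e[0], features))
        return nearest[1]

memory = SensoryMotorMemory()
memory.memorize([0.9, 0.1], "turn_left")   # e.g. strong disparity cue on the left
memory.memorize([0.1, 0.9], "turn_right")  # e.g. strong disparity cue on the right
print(memory.recall([0.8, 0.2]))  # → turn_left
```

Note that such a lookup maps sensory input directly to action, which is consistent with the abstract's claim that no internal metric reconstruction of the environment is needed.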