By the onset of reaching, young infants are already able to keep track of the position of their hand by using visual feedback from the target and proprioceptive feedback from the arm. How is this multimodal coordination achieved? We propose that infants learn to coordinate vision and proprioception by using tactile feedback from the target. To evaluate this hypothesis, we employ an evolutionary learning algorithm as a proxy for trial-and-error sensorimotor development in young infants. A series of simulation studies illustrates how touch: 1) helps coordinate vision and proprioception; 2) facilitates an efficient reaching strategy; and 3) promotes intermodal recalibration when the coordination is perturbed. We present two developmental predictions generated by the model and discuss the relative importance of visual and tactile feedback while learning to reach.