During an individual's normal interaction with the environment and other humans, visual and linguistic signals often coincide and can be integrated very quickly. This has been clearly demonstrated in recent eye-tracking studies showing that visual perception constrains on-line comprehension of spoken language. In a modified visual search task, we found the inverse: real-time language comprehension can also constrain visual perception. In standard visual search tasks, the number of distractors in the display strongly affects search time for a target defined by a conjunction of features, but not for a target defined by a single feature. However, we found that when a conjunction target was identified by a spoken instruction presented concurrently with the visual display, the incremental processing of the spoken language allowed the search process to proceed in a manner considerably less affected by the number of distractors. These results suggest that perceptual systems specialized for language and for vision interact more fluidly than previously thought.