ONE issue of continuing debate in language research concerns whether the brain holds separate representations for semantic information conveyed through the auditory versus visual modalities. Regardless of whether we hear, see or read meaningful information, our brains automatically activate both auditory and visual semantic associations to the sensory input. The prominent model for how the brain makes these cross-modality associations holds that semantic information conveyed through either sensory input modality is represented in a shared semantic system comprising the traditionally identified language areas of the brain. A few recent case reports, as well as activation imaging studies, have challenged this notion by demonstrating category-specific organization of the semantic system across spatially discrete brain regions. Neither view posits a role for primary sensory cortices in semantic processing. We obtained positron emission tomographic (PET) images while subjects performed an auditory responsive naming task, an auditory analog to visual object naming. Subjects heard and responded to descriptions of concrete objects while blindfolded to prevent visual stimulation. Our results showed that, in addition to traditional language centers, auditory language input produced reciprocal activation in primary and secondary visual brain regions, just as if the language stimuli had entered through the visual modality. These findings provide evidence for a distributed semantic system in which sensory-specific semantic modules are mutually interactive, operating directly on early sensory processing centers.
NeuroReport 9: 2409–2413 © 1998 Rapid Science Ltd.