This study was performed to test the usefulness of the EEG as a research instrument for music psychology in individuals. Measuring the degree of functional interrelatedness of brain areas by coherence estimates has turned out to be more efficient than amplitude mapping. Therefore, the method, based on the analysis of EEG periods of at least 1 min, has been expanded to estimate all possible coherence values between the 19 electrodes (i.e., 171 values) and to observe any significant changes in those values caused by different musical tasks. This report concerns observations in a total of 49 healthy subjects (29 male and 20 female). The main goal of this study was to determine the degree of engagement of either hemisphere in the processing of music. Two items were shown to indicate hemispheric involvement: (1) the topographic distribution of "focal points of coherence" (brain areas participating in coherence changes with respect to a great number of other brain areas) and (2) the number of intrahemispheric coherence increases. In most cases, both items seem to focus on the same hemisphere. Taking these as parameters for hemispheric engagement, the following principal observations were made: the beta bands (and particularly their uppermost ranges) seem to play a major role in the processing of music; the hemispheric engagement, however, need not be the same for each frequency band. No hemisphere seems to be preferred. When listening to music is shifted between different styles, laterality may change. When the same tasks are repeated at several weeks' intervals, a fairly large degree of consistency is found.
Imagining music and composing clearly differ from listening by activating many more coherence increases in the beta band and by an increased percentage of interhemispheric interaction. This kind of analysis may also provide some clues as to how a piece of music is processed by an individual. The coherence changes observed may represent events taking place in a system of differential attention that selects and orders the sensory inputs before the musical material is further processed at higher-order hierarchical levels.
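To make the pairwise analysis concrete, the following is a minimal sketch of how magnitude-squared coherence could be estimated for all 171 electrode pairs of a 19-channel (10-20) montage and averaged within frequency bands. It is not the study's implementation: the sampling rate, band limits, 2-s Welch segments, and the names (band_coherence, FS, BANDS) are assumptions made for illustration, since the abstract does not specify the estimation parameters or the statistical criterion for "significant changes."

```python
# Sketch: magnitude-squared coherence for all 19*18/2 = 171 electrode pairs,
# averaged within conventional EEG frequency bands (band limits are assumed).
from itertools import combinations

import numpy as np
from scipy.signal import coherence

FS = 128            # sampling rate in Hz (assumed)
N_ELECTRODES = 19   # 10-20 montage -> 171 unordered pairs
BANDS = {           # approximate band limits in Hz (assumed)
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta1": (13, 20),
    "beta2": (20, 32),
}

def band_coherence(eeg, fs=FS, nperseg=2 * FS):
    """Return {band: {(i, j): mean coherence}} for every electrode pair.

    `eeg` is a (19, n_samples) array covering at least ~1 min of signal,
    as in the study; 2-s Welch segments are an assumption of this sketch.
    """
    results = {band: {} for band in BANDS}
    for i, j in combinations(range(eeg.shape[0]), 2):   # 171 pairs
        f, cxy = coherence(eeg[i], eeg[j], fs=fs, nperseg=nperseg)
        for band, (lo, hi) in BANDS.items():
            mask = (f >= lo) & (f < hi)
            results[band][(i, j)] = float(cxy[mask].mean())
    return results

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_eeg = rng.standard_normal((N_ELECTRODES, 60 * FS))  # 1 min of noise
    coh = band_coherence(fake_eeg)
    print(len(coh["beta2"]))  # -> 171 pair values
```

On this basis, a "focal point of coherence" in the abstract's sense could be identified by counting, for each electrode, how many of its 18 pairwise coherence values change significantly between a musical task and a resting baseline; the abstract does not state which statistical test was used for that comparison.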