Test-retest reliability of rotational chair testing at a single facility has previously been examined by others. The data analysis methods themselves, however, have received far less attention. The variety of hardware and software currently in use may, in theory, affect the results obtained for a given subject tested at different facilities.
The purposes of this study were, first, to quantify the variability in the analysis of identical raw data files at multiple rotational chair testing facilities when automated analysis was used; second, to evaluate the effect of operator intervention on the analysis; and third, to identify possible sources of that variability. Raw data were collected from 10 normal subjects at 0.05 Hz and 0.5 Hz (50 degrees per second peak velocity). Diskettes containing the raw electro-oculogram data files were then distributed to eight participating laboratories for analysis by two methods: (1) using automated analysis algorithms, and (2) using the same algorithms but allowing operator intervention in the analysis. The response parameters calculated were gain and phase (re: velocity).
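For illustration, the following sketch shows one way gain and phase (re: velocity) might be computed from such recordings, assuming sinusoidal stimulation and a least-squares sinusoid fit to slow-phase eye velocity; the function name, input conventions, and fitting approach are assumptions made for this sketch, not the algorithms used by the participating laboratories.

    import numpy as np

    def gain_and_phase(eye_vel, chair_vel, freq_hz, fs):
        # Fit A*sin(wt) + B*cos(wt) + C at the stimulus frequency to both
        # traces. eye_vel is assumed to be slow-phase eye velocity with
        # fast phases already removed; chair_vel is the chair velocity
        # signal (both in deg/s); fs is the sampling rate in Hz.
        t = np.arange(len(eye_vel)) / fs
        w = 2.0 * np.pi * freq_hz
        X = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])

        def fit(y):
            (a, b, _), *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
            return np.hypot(a, b), np.arctan2(b, a)  # amplitude, phase (rad)

        eye_amp, eye_ph = fit(eye_vel)
        chair_amp, chair_ph = fit(chair_vel)
        gain = eye_amp / chair_amp
        # Phase of eye velocity re: chair velocity, wrapped to [-180, 180)
        phase_deg = np.degrees((eye_ph - chair_ph + np.pi) % (2.0 * np.pi) - np.pi)
        return gain, phase_deg

In this framing, operator intervention would correspond to hand-editing the slow-phase segments of the electro-oculogram before the fit is applied, which is one way the two analysis methods could diverge.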
The SD of the gain values per subject for automated analysis ranged from 0.01 to 0.32 gain units, and the SD of the phase values from 0.4 to 13.7 degrees. For analysis with operator intervention, the SD of the gain values ranged from 0.02 to 0.10 gain units, and the SD of the phase values from 0.4 to 4.4 degrees. The difference between automated analysis and analysis with operator intervention was significant for the gain calculations (p < 0.02) but not for the phase calculations (p > 0.05). This study demonstrates significant variability in the automated analysis of rotational chair raw data for both gain and phase. Operator intervention in the analysis significantly reduces that variability for gain but not for phase.
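As a worked illustration of how per-subject SD figures like those above could be tabulated across laboratories and the two methods compared, consider the sketch below. The data here are synthetic placeholders, and the choice of a paired t-test is an assumption for illustration only; the abstract does not name the statistic that was actually used.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical layout: rows are the 8 laboratories, columns the 10
    # subjects; values are gains at one stimulus frequency. The means and
    # spreads are invented for illustration only.
    auto_gain = rng.normal(0.6, 0.10, size=(8, 10))    # automated analysis
    manual_gain = rng.normal(0.6, 0.04, size=(8, 10))  # operator intervention

    # Per-subject SD across laboratories (the quantity reported above)
    sd_auto = auto_gain.std(axis=0, ddof=1)
    sd_manual = manual_gain.std(axis=0, ddof=1)
    print("automated:  SD %.2f to %.2f" % (sd_auto.min(), sd_auto.max()))
    print("intervened: SD %.2f to %.2f" % (sd_manual.min(), sd_manual.max()))

    # One plausible comparison of the two methods: a paired t-test on the
    # per-subject SDs (an assumption; the original test is not stated).
    t, p = stats.ttest_rel(sd_auto, sd_manual)
    print("paired t = %.2f, p = %.4f" % (t, p))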