Although generally introduced to guard against human error, automated devices can fundamentally change how people approach their work, which in turn can lead to new and different kinds of error. The present study explored the extent to which errors of omission (failures to respond to system irregularities or events because automated devices fail to detect or indicate them) and errors of commission (following an automated directive despite contradictory information from other, more reliable sources, because that information is either not checked or is discounted) can be reduced under conditions of social accountability. Results indicated that making participants accountable for either their overall performance or their decision accuracy led to lower rates of "automation bias". Errors of omission proved to be the result of cognitive vigilance decrements, whereas errors of commission proved to result from a combination of a failure to take information into account and a belief in the superior judgement of automated aids.
(C) 2000 Academic Press.