The objective structured clinical examination (OSCE) now has an established place in the assessment of the medical undergraduate. While much has been written about the reliability of the OSCE, empirical work on determining the passing score that represents competence on the OSCE is rarely encountered. If the OSCE is to play its role in the 'high-stakes' testing of clinical competence, it is important that this passing score be set reliably and defensibly. This article illustrates how a two-session modified Angoff standard-setting procedure was used to set the passing score on a 14-station Obstetrics and Gynaecology OSCE used to assess final-year students at The Queen's University of Belfast. The Angoff methodology harnesses the professional judgement of expert judges to establish defensible standards. Four university teachers, five non-academic consultants and six junior clinical staff took part in the two-session procedure. In the first session, the judges (individually and in silence) used their professional judgement to estimate the score that a minimally competent final-year obstetrics and gynaecology student should achieve on each tested element of the OSCE. In the second session they revised their session 1 judgements in the light of the OSCE scores of real students and an opportunity for structured discussion. The passing score for the OSCE is reported, together with the statistical measures that assure its reliability.
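The core Angoff calculation described above can be sketched in a few lines. In a minimal reading of the method, each judge's implied passing score is the sum of their per-station estimates for the minimally competent candidate, and the examination's passing score is the mean of those totals; the spread across judges then gives one simple indicator of how reproducible the standard is. The judge count, station count, and all scores below are invented for illustration and do not reproduce the Belfast data (which used 15 judges and 14 stations):

```python
from math import sqrt
from statistics import mean, stdev

# judges_estimates[j][s] = judge j's estimate of the score a minimally
# competent final-year student should achieve on station s.
# Three judges and three stations shown for brevity; values are hypothetical.
judges_estimates = [
    [6.0, 5.5, 7.0],  # judge 1
    [6.5, 5.0, 6.5],  # judge 2
    [5.5, 6.0, 7.5],  # judge 3
]

# Each judge's implied passing score: the sum of their station estimates.
per_judge_totals = [sum(estimates) for estimates in judges_estimates]

# The Angoff passing score: the mean of the judges' totals.
passing_score = mean(per_judge_totals)

# Standard error of the passing score across judges, one common measure
# of the reproducibility of the standard.
se_across_judges = stdev(per_judge_totals) / sqrt(len(per_judge_totals))

print(passing_score, round(se_across_judges, 3))
```

In a two-session design such as the one described, the same calculation would simply be rerun on the revised session 2 estimates, and the change in the passing score and its standard error would show the effect of exposing judges to real student scores and structured discussion.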