This paper reports a follow-on project that assessed a series of portfolios assembled by a cohort of participants attending a course for prospective general practice trainers. In an attempt to enhance reliability, participants were offered a framework for defining and addressing problems using a reflective practice model. The reliability of the judgements made by a panel of assessors about individual 'components', together with an overall global judgement about performance, was studied. The reliability of individual assessors' judgements (i.e. their consistency) was moderate, but inter-rater reliability did not reach a level that could support a safe summative judgement. Although the framework offered a possible structure for demonstrating reflective processes, the levels of reliability reached were similar to those in the earlier work and in subjective assessments generally. This perhaps reflected the individuality of the personal agendas of both the assessed and the assessors, and variations in portfolio structure and content; even agreement among the assessors about evidence that the framework had been used was poor. Suggestions for future approaches are made. The conclusion remains that while portfolios may be valuable as resources for learning, as assessment tools they should be treated as problematic.