This paper examines the challenges involved in conducting an informal usability case study based on the introduction of a new information retrieval system to experienced users. We present a summary of activities performed during two iterations of usability testing and describe our analysis methodology. This methodology incorporates several grouping and prioritizing methods, which provide one of the major contributions of the work. During the course of the case study, we learned some valuable lessons specific to the Text REtrieval Conference (TREC). These TREC-specific lessons led to recommendations for changes in the TREC topic development and assessment tasks. Results of the study include lessons learned about both the users and the testing techniques (Hoffman & Downey, 1997). (C) 1999 Published by Elsevier Science Ltd. All rights reserved.