This paper sets out to illustrate the importance of transparency within software support systems, and in particular within intelligent assistant systems that perform complex industrial design tasks. Such transparency (in the sense of 'clear' or 'easy to understand') can be achieved by two distinct, complementary strategies: (i) the design of intelligible systems that avoid the need for in-depth explanation; and (ii) the flexible generation of explanations for those definitions or aspects of the system or domain that remain ambiguous. The paper shows that generating useful explanations that go beyond a simple justification of a problem-solving trace requires the acquisition of specific explanatory knowledge; the problem-solving techniques are not sufficient by themselves. A new approach to acquiring and modelling explanatory knowledge for software systems is presented. The resulting four-layer explanatory model can be used to determine the range of explanations suitable for a given system's domain. This model has been successfully used in the development of an explanation component for the design assistant system ASSIST, which supports factory layout planning, itself a complex design task. (C) 1997 Elsevier Science Limited.