Research with neural networks typically ignores the role of knowledge in learning by initializing the network with random connection weights. We examine a new extension of a well-known generative algorithm, cascade-correlation. Ordinary cascade-correlation constructs its own network topology by recruiting new hidden units as needed to reduce network error. The extended algorithm, knowledge-based cascade-correlation (KBCC), recruits previously learned sub-networks as well as single hidden units. This paper describes KBCC and assesses its performance on a series of small but clear problems involving discrimination between two classes. The target class is distributed as a simple geometric figure. Relevant source knowledge consists of various linear transformations of the target distribution. KBCC is observed to find, adapt, and use its relevant knowledge to speed learning significantly.
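
To make the recruitment idea concrete, the following is a minimal sketch, not the paper's implementation: candidates (single hidden units or, in the KBCC extension, previously learned sub-networks treated as black boxes) are scored by how strongly their outputs covary with the network's residual error, and the best-scoring candidate is recruited. All function names, the candidate pool, and the toy data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid_unit(inputs, weights):
    """A single candidate hidden unit: sigmoid of a weighted input sum."""
    return 1.0 / (1.0 + np.exp(-inputs @ weights))

def covariance_score(candidate_out, residual_error):
    """Cascade-correlation-style objective: magnitude of the covariance
    between a candidate's activations and the residual error."""
    c = candidate_out - candidate_out.mean()
    e = residual_error - residual_error.mean(axis=0)
    return np.abs(c @ e).sum()

# Toy training patterns (2 inputs) and a stand-in for the current
# residual error (target minus output) of a one-output network.
X = rng.normal(size=(50, 2))
residual = np.sin(X[:, :1])

# Candidate pool: a few randomly initialized single units, plus a fixed
# function standing in for a previously learned source network that
# KBCC could recruit as a whole.
candidates = [("unit", rng.normal(size=2)) for _ in range(4)]
candidates.append(("source-net", None))

def evaluate(kind, params):
    if kind == "unit":
        out = sigmoid_unit(X, params)
    else:  # previously learned sub-network, treated as a black box
        out = np.tanh(X.sum(axis=1))
    return covariance_score(out, residual)

scores = [(evaluate(kind, p), kind) for kind, p in candidates]
best_score, best_kind = max(scores)
print(f"recruit candidate of type '{best_kind}' with score {best_score:.3f}")
```

In this sketch the relevant source knowledge would tend to win the competition whenever its output tracks the residual error better than any freshly initialized unit, which is the intuition behind KBCC's observed speed-up.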