In the construction of a Bayesian network, it is always assumed that variables sharing the same parent are conditionally independent given that parent. In practice, this assumption may not hold and can give rise to incorrect inferences. In cases where some dependency is found between such variables, we propose that creating a hidden node, which in effect models the dependency, can solve the problem. To determine the conditional probability matrices for the hidden node, we use a gradient descent method. The objective function to be minimised is the squared error between the measured and computed values of the instantiated nodes. Both forward and backward propagation are used to compute the node probabilities. The error gradients can be treated as updating messages and can be propagated in any direction throughout any singly connected network. For parents with more than two children, we use the simplest node-by-node creation approach. We tested our approach on two different networks in an endoscope guidance system and, in both cases, demonstrated improved results.
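The idea of fitting a hidden node's conditional probability matrix by gradient descent on a squared-error objective can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a toy network (a binary parent P with a known prior, a hidden node H below it, and one observed child C), uses a softmax parameterisation to keep the CPT rows valid probability distributions, and uses a finite-difference gradient instead of the analytic error-gradient messages described above. All names and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: prior over the binary parent P, a fixed CPT for
# the child C given the hidden node H, and a "measured" marginal for C
# that the hidden node's CPT should reproduce.
p_prior = np.array([0.6, 0.4])          # P(P)
cpt_c_given_h = np.array([[0.9, 0.1],   # P(C | H=0)
                          [0.2, 0.8]])  # P(C | H=1)
target_c = np.array([0.55, 0.45])       # measured marginal of C

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def computed_c(theta):
    """Forward pass: marginal P(C) implied by the hidden node's CPT."""
    cpt_h_given_p = softmax(theta)      # row p is P(H | P=p)
    p_h = p_prior @ cpt_h_given_p       # marginalise out the parent
    return p_h @ cpt_c_given_h          # marginalise out the hidden node

def loss(theta):
    """Squared error between measured and computed node values."""
    d = computed_c(theta) - target_c
    return float(d @ d)

# Plain gradient descent; finite differences stand in for the propagated
# error-gradient messages of the actual method.
theta = rng.normal(size=(2, 2))
lr, eps = 2.0, 1e-6
for _ in range(1000):
    grad = np.zeros_like(theta)
    for i in range(2):
        for j in range(2):
            t = theta.copy()
            t[i, j] += eps
            grad[i, j] = (loss(t) - loss(theta)) / eps
    theta -= lr * grad

print(loss(theta))  # small residual once the hidden CPT fits the data
```

The softmax reparameterisation is one common way to enforce the row-sum constraint during unconstrained gradient descent; projected gradient steps on the probabilities themselves would be an alternative.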