Approximating the inference probability Pr[X = x | E = e] in any sense, even for a single evidence node E, is NP-hard. This result holds for belief networks that are allowed to contain extreme conditional probabilities, that is, conditional probabilities arbitrarily close to 0. Nevertheless, all previous approximation algorithms have failed to approximate many inferences efficiently, even for belief networks without extreme conditional probabilities. We prove that we can approximate probabilistic inference efficiently in belief networks without extreme conditional probabilities. We construct a randomized approximation algorithm, the bounded-variance algorithm, that is a variant of the known likelihood-weighting algorithm. The bounded-variance algorithm is the first algorithm with provably fast inference approximation on all belief networks without extreme conditional probabilities. From the bounded-variance algorithm, we construct a deterministic approximation algorithm using current advances in the theory of pseudorandom generators. In contrast to the exponential worst-case behavior of all previous deterministic approximations, the deterministic bounded-variance algorithm approximates inference probabilities in worst-case time that is subexponential, 2^((log n)^d), for some integer d that is a linear function of the depth of the belief network. (C) 1997 Elsevier Science B.V.
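To make the starting point concrete, the following is a minimal sketch of plain likelihood weighting (the algorithm the bounded-variance algorithm modifies), not of the bounded-variance variant itself. The three-node network, its conditional probability values, and all function names are assumptions invented for illustration; the CPTs deliberately avoid extreme probabilities, matching the class of networks the abstract discusses.

```python
import random

# Hypothetical belief network for illustration: Cloudy -> Rain -> WetGrass.
# No conditional probability is close to 0 or 1 (no extreme probabilities).
P_CLOUDY = 0.5
P_RAIN = {True: 0.8, False: 0.2}   # P(Rain = true | Cloudy)
P_WET = {True: 0.9, False: 0.1}    # P(WetGrass = true | Rain)

def likelihood_weighting(n_samples, wet_evidence=True, seed=0):
    """Estimate Pr[Rain = true | WetGrass = wet_evidence] by likelihood
    weighting: sample the non-evidence nodes forward in topological order,
    clamp the evidence node, and weight each sample by the probability of
    the evidence given its sampled parents."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        cloudy = rng.random() < P_CLOUDY
        rain = rng.random() < P_RAIN[cloudy]
        # The evidence node is not sampled; its likelihood is the weight.
        p_wet = P_WET[rain]
        w = p_wet if wet_evidence else 1.0 - p_wet
        den += w
        if rain:
            num += w
    return num / den

print(likelihood_weighting(100_000))
```

For this toy network the exact posterior is Pr[Rain | WetGrass] = (0.5 * 0.9) / (0.5 * 0.9 + 0.5 * 0.1) = 0.9, and the estimate converges to it. The abstract's point is that the raw weights of plain likelihood weighting can have high variance; the bounded-variance variant controls this variance to obtain provably fast approximation on all networks without extreme conditional probabilities.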