The problem of optimal data fusion in multiple detection systems is studied in the case where training examples are available, but no a priori information is available about the probability distributions of errors committed by the individual detectors. Earlier solutions to this problem require some knowledge of the error distributions of the detectors, for example, in either a parametric form or a closed analytical form. Here we show that, given a sufficiently large training sample, an optimal fusion rule can be implemented with an arbitrary level of confidence. We first consider the classical cases of the Bayesian rule and the Neyman-Pearson test for a system of independent detectors. We then show a general result: any test function with a suitable Lipschitz property can be implemented with arbitrary precision, based on a training sample whose size is a function of the Lipschitz constant, the number of parameters, and empirical measures. The general case subsumes the cases of nonindependent and correlated detectors.
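The independent-detector case described above can be illustrated with a minimal sketch: estimate each detector's error rate from a labelled training sample, then fuse the detectors' binary decisions with log-odds weights (a Chair-Varshney-style Bayesian fusion rule). The three detectors, their symmetric error rates, and the sample sizes below are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 conditionally independent detectors whose true
# (symmetric) error rates are unknown to the fusion centre.
true_err = np.array([0.15, 0.20, 0.25])

def detect(h, n):
    """Each detector reports the true hypothesis h in {-1,+1}, flipped w.p. its error rate."""
    flips = rng.random((n, 3)) < true_err
    return np.where(flips, -h[:, None], h[:, None])

# Training sample: labelled examples used to estimate the error distributions
# empirically, since no a priori knowledge of them is assumed.
h_train = rng.choice([-1, 1], size=2000)
u_train = detect(h_train, 2000)
p_hat = np.mean(u_train != h_train[:, None], axis=0)  # empirical error rates

# Bayesian (likelihood-ratio) fusion for independent detectors:
# a weighted vote with log-odds weights from the estimated error rates.
w = np.log((1 - p_hat) / p_hat)

# Evaluate the learned fusion rule on fresh data.
h_test = rng.choice([-1, 1], size=20000)
u_test = detect(h_test, 20000)
fused = np.sign(u_test @ w)
fused_acc = np.mean(fused == h_test)
```

With enough training data the estimated weights approach the optimal ones, and the fused decision is more reliable than any single detector, which is the sense in which the optimal rule is "implemented with an arbitrary level of confidence" as the sample grows.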