This paper deals with a situation of some importance for the analysis of experimental data via a Neural Network (NN) or similar device: let N data be given, such that N = N_s + N_b, where N_s is the number of signals, N_b the number of background events, and both are unknown. Assume that a NN has been trained such that it tags signals with efficiency F_s (0 < F_s < 1) and background data with efficiency F_b (0 < F_b < 1). Applying the NN yields N_Y tagged events. We demonstrate that knowledge of N_Y is sufficient to calculate confidence bounds for the signal likelihood which have the same statistical interpretation as the Clopper-Pearson bounds for the well-studied case of direct signal observation. Subsequently, we discuss rigorous bounds for the a posteriori distribution function of the signal probability, as well as for the (closely related) likelihood that there are N_s signals in the data. We compare them with results obtained by starting from a maximum-entropy-type assumption for the a priori likelihood that there are N_s signals in the data and applying Bayes' theorem. (C) 1997 Elsevier Science B.V.
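The counting setup described above can be sketched numerically. The snippet below (a minimal illustration, not the paper's rigorous bounds) encodes the assumed relation E[N_Y] = F_s*N_s + F_b*N_b and inverts it to obtain a simple moment estimate of N_s from an observed tag count; all numeric values are hypothetical.

```python
# Sketch of the tagging model from the abstract: N = N_s + N_b events,
# a NN tags signal with efficiency F_s and background with F_b, so the
# expected number of tagged events is E[N_Y] = F_s*N_s + F_b*N_b.
# Inverting the expectation gives a simple moment estimate of N_s;
# this is only an illustration, NOT the confidence bounds of the paper.

def expected_tags(n_s: float, n_b: float, f_s: float, f_b: float) -> float:
    """Expected number of tagged events, E[N_Y] = F_s*N_s + F_b*N_b."""
    return f_s * n_s + f_b * n_b

def moment_estimate_ns(n_y: float, n: float, f_s: float, f_b: float) -> float:
    """Solve E[N_Y] = F_s*N_s + F_b*(N - N_s) for N_s (requires F_s != F_b)."""
    return (n_y - f_b * n) / (f_s - f_b)

if __name__ == "__main__":
    N_s, N_b = 300, 700        # hypothetical true composition (unknown in practice)
    F_s, F_b = 0.80, 0.10      # hypothetical tagging efficiencies
    N = N_s + N_b
    N_Y = expected_tags(N_s, N_b, F_s, F_b)
    # Recovers N_s (approximately 300) from N_Y and N alone:
    print(N_Y, moment_estimate_ns(N_Y, N, F_s, F_b))
```

For finite samples N_Y fluctuates binomially around its expectation, which is why the paper derives Clopper-Pearson-style confidence bounds rather than relying on this point estimate.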