In this paper, we discuss the problem of active training data selection for improving the generalization capability of a neural network. We view the learning problem from a function approximation perspective and formalize it as an inverse problem. Based on this framework, we analytically derive a method of choosing a training data set optimized with respect to the Wiener optimization criterion. The final result uses the a priori correlation information on the original function ensemble to devise an efficient sampling scheme which, when used in conjunction with the learning scheme described here, is shown to result in optimal generalization. This result is substantiated through a simulated example and a learning problem in a high dimensional function space.
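As a rough orientation only (this is a generic formulation, not necessarily the paper's exact one), a Wiener-type criterion for generalization can be expressed as the expected squared error of the learned approximation, averaged over the function ensemble and the noise, with the training inputs chosen to minimize it:
\[
J_G \;=\; \mathbb{E}_{f,\,\epsilon}\!\left[\,\lVert \hat{f} - f \rVert^{2}\,\right],
\]
where $f$ denotes a target function drawn from the original ensemble with known a priori correlation, $\epsilon$ the observation noise, and $\hat{f}$ the function reconstructed by the learning scheme from the sampled training data.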