It is well known that prior knowledge or bias can speed up learning, at least in theory. In practice, however, it has proved difficult to make constructive use of prior knowledge so that approximately correct hypotheses can be learned efficiently. In this paper, we consider a particular form of bias consisting of a set of "determinations." A set of attributes is said to determine a given attribute if the latter is purely a function of the former. The bias is tree-structured if there is a tree of attributes such that the attribute at any node is determined by its children, where the leaves correspond to input attributes and the root corresponds to the target attribute for the learning problem. The set of allowed functions at each node is called the basis. A tree-structured bias restricts the target functions to those representable by a read-once formula (a Boolean formula in which each variable occurs at most once) of a given structure over the basis functions. We show that efficient learning is possible using a given tree-structured bias from random examples and membership queries, provided that the basis class itself is learnable and obeys some mild closure conditions. The algorithm uses a form of controlled experimentation to learn each part of the overall function, fixing the inputs to the other parts of the function at appropriate values. We present empirical results showing that, when a tree-structured bias is available, our method significantly improves on knowledge-free induction. We also show that there are hard cryptographic limitations to generalizing these positive results to structured determinations in the form of a directed acyclic graph.
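To make the idea of controlled experimentation concrete, the following is a minimal illustrative sketch, not the paper's algorithm: the learner knows a read-once tree structure over the simple monotone basis {AND, OR} and identifies each node's gate with membership queries, fixing every other subtree's leaves so that the probed node's value propagates unchanged to the root. All names and the choice of basis here are illustrative assumptions.

```python
AND, OR = "AND", "OR"

class Node:
    """A node of a read-once formula tree: either a leaf (input index) or a gate."""
    def __init__(self, gate=None, left=None, right=None, leaf=None):
        self.gate, self.left, self.right, self.leaf = gate, left, right, leaf

def evaluate(node, x):
    """Evaluate the formula on assignment x (a dict: leaf index -> 0/1)."""
    if node.leaf is not None:
        return x[node.leaf]
    l, r = evaluate(node.left, x), evaluate(node.right, x)
    return (l and r) if node.gate == AND else (l or r)

def leaves(node):
    if node.leaf is not None:
        return [node.leaf]
    return leaves(node.left) + leaves(node.right)

def force(x, node, value):
    """Fix a subtree's output by setting all of its leaves (basis is monotone)."""
    for i in leaves(node):
        x[i] = value

def learn_gates(structure, query, path=()):
    """Learn each internal gate top-down with one membership query per node.

    `structure` mirrors the hidden tree but has gate=None at internal nodes.
    `path` holds (learned_gate, sibling_subtree) pairs from the root down;
    setting each off-path sibling to its parent gate's identity element
    (1 for AND, 0 for OR) makes the ancestors transparent, so the root
    output equals the probed node's output.
    """
    if structure.leaf is not None:
        return
    def probe(lv, rv):
        x = {}
        for gate, sibling in path:                # neutralize ancestors
            force(x, sibling, 1 if gate == AND else 0)
        force(x, structure.left, lv)              # set this node's inputs
        force(x, structure.right, rv)
        return query(x)
    # AND(0,1) = 0 but OR(0,1) = 1, so a single query separates the gates.
    structure.gate = OR if probe(0, 1) else AND
    learn_gates(structure.left, query, path + ((structure.gate, structure.right),))
    learn_gates(structure.right, query, path + ((structure.gate, structure.left),))

# Hidden target: OR(AND(x0, x1), OR(x2, x3)); the learner sees only the shape.
hidden = Node(OR,
              Node(AND, Node(leaf=0), Node(leaf=1)),
              Node(OR,  Node(leaf=2), Node(leaf=3)))
skeleton = Node(None,
                Node(None, Node(leaf=0), Node(leaf=1)),
                Node(None, Node(leaf=2), Node(leaf=3)))

queries = 0
def membership_query(x):
    global queries
    queries += 1
    return evaluate(hidden, x)

learn_gates(skeleton, membership_query)
print(skeleton.gate, skeleton.left.gate, skeleton.right.gate, queries)
```

The monotone basis is what makes the experiment easy to set up: any subtree can be forced to 0 or 1 by setting all of its leaves, which is how the algorithm "fixes the inputs to the other parts of the function at appropriate values." Richer bases, as in the paper, require learning the basis class itself rather than a one-query gate test.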