Univariate decision trees at each decision node consider the value of only one feature, leading to axis-aligned splits. In a linear multivariate decision tree, each decision node divides the input space into two with a hyperplane. In a nonlinear multivariate tree, a multilayer perceptron at each node divides the input space arbitrarily, at the expense of increased complexity and a higher risk of overfitting. We propose omnivariate trees, in which a decision node may be univariate, linear, or nonlinear depending on the outcome of comparative statistical tests on accuracy, thus automatically matching the complexity of the node to the subproblem defined by the data reaching that node. Such an architecture frees the designer from choosing the appropriate node type, performing model selection automatically at each node. Our simulation results indicate that this decision tree induction method generalizes better than trees with the same type of node everywhere, and that it induces small trees.
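
The following is a minimal sketch of the node-level model selection idea described above, not the paper's exact procedure. It assumes scikit-learn estimators as stand-ins for the three node types (a depth-1 DecisionTreeClassifier for the univariate split, LogisticRegression for the linear hyperplane, MLPClassifier for the nonlinear node) and a paired t-test over cross-validation folds in place of the paper's comparative statistical test; the function name choose_node_model is hypothetical.

from scipy.stats import ttest_rel
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def choose_node_model(X, y, alpha=0.05, cv=5, seed=0):
    """Pick the simplest node type whose accuracy is not significantly
    worse than that of the most accurate candidate."""
    # Candidates ordered from simplest to most complex.
    candidates = [
        ("univariate", DecisionTreeClassifier(max_depth=1, random_state=seed)),
        ("linear", LogisticRegression(max_iter=1000)),
        ("nonlinear", MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                    random_state=seed)),
    ]
    # Per-fold accuracies of each candidate on the data reaching this node.
    scores = {name: cross_val_score(model, X, y, cv=cv)
              for name, model in candidates}
    best = max(scores, key=lambda name: scores[name].mean())
    for name, model in candidates:
        if name == best:
            return name, model.fit(X, y)
        # Keep the simpler model unless the best one is significantly better
        # (paired t-test over the shared CV folds; a stand-in for the
        # comparative tests the abstract refers to).
        _, p = ttest_rel(scores[best], scores[name])
        if p >= alpha:
            return name, model.fit(X, y)

Because the candidates are scanned from simplest to most complex, a simpler node wins whenever its accuracy is statistically indistinguishable from the best, mirroring the abstract's bias toward matching node complexity to the subproblem at hand.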