This article presents a new incremental learning algorithm for classification tasks, called NetLines, which is well adapted for both binary and real-valued input patterns. It generates small, compact feedforward neural networks with one hidden layer of binary units and binary output units. A convergence theorem ensures that solutions with a finite number of hidden units exist for both binary and real-valued input patterns. An implementation for problems with more than two classes, valid for any binary classifier, is proposed. The generalization error and the size of the resulting networks are compared to the best published results on well-known classification benchmarks. Early stopping is shown to decrease overfitting, without improving the generalization performance.
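
The abstract only announces the multi-class construction; its details are given in the body of the paper. As an illustrative assumption, not the paper's scheme, the sketch below shows the standard one-vs-rest decomposition, which turns any binary classifier into a multi-class one by training one classifier per class and selecting the class with the largest output. The `BinaryPerceptron` stand-in and the `OneVsRest` wrapper are hypothetical names, and the perceptron replaces the NetLines-grown network purely for brevity.

```python
# Illustrative sketch only: a generic one-vs-rest wrapper around an arbitrary
# binary classifier. Names are hypothetical and do not come from the paper.
import numpy as np


class BinaryPerceptron:
    """Stand-in binary classifier with targets in {-1, +1}."""

    def __init__(self, epochs=100, lr=1.0):
        self.epochs, self.lr = epochs, lr

    def fit(self, X, y):
        Xb = np.hstack([X, np.ones((len(X), 1))])      # append bias column
        self.w = np.zeros(Xb.shape[1])
        for _ in range(self.epochs):
            for xi, yi in zip(Xb, y):
                if yi * (xi @ self.w) <= 0:            # misclassified: update
                    self.w += self.lr * yi * xi
        return self

    def decision(self, X):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return Xb @ self.w                             # signed margin


class OneVsRest:
    """Trains one binary classifier per class; predicts by largest margin."""

    def __init__(self, make_binary):
        self.make_binary = make_binary

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = []
        for c in self.classes_:
            target = np.where(y == c, 1, -1)           # class c vs. the rest
            self.models_.append(self.make_binary().fit(X, target))
        return self

    def predict(self, X):
        scores = np.column_stack([m.decision(X) for m in self.models_])
        return self.classes_[np.argmax(scores, axis=1)]


if __name__ == "__main__":
    # Toy 3-class problem with well-separated Gaussian clusters
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(c, 0.3, size=(30, 2)) for c in ((0, 0), (3, 0), (0, 3))])
    y = np.repeat([0, 1, 2], 30)
    clf = OneVsRest(BinaryPerceptron).fit(X, y)
    print("training accuracy:", np.mean(clf.predict(X) == y))
```

Any binary learner exposing `fit` and a real-valued `decision` score could be plugged into the wrapper in place of the perceptron, which is the sense in which such a decomposition is valid for any binary classifier.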