This paper reviews recent advances in supervised learning with a focus on two key issues: performance and efficiency. Performance addresses the generalization capability of a learning machine on randomly chosen samples that are not included in the training set. Efficiency deals with the complexity of a learning machine in both space and time. Since these two issues are common to various learning machines and learning approaches, we focus on a special type of adaptive learning system with a neural architecture. We discuss four types of learning approaches: training a single model; combinations of several well-trained models; combinations of many weak models; and evolutionary computation of models. We explore the advantages and weaknesses of each approach and their interrelations, and we pose open questions for possible future research.