This paper presents an empirical investigation of the recently proposed k-Nearest Centroid Neighbours (k-NCN) classification rule, along with two heuristic modifications of it. These alternatives use both the proximity and the geometrical distribution of the prototypes in the training set to estimate the class label of a given sample. The experimental results show that both alternatives achieve significantly better classification rates than the k-Nearest Neighbours rule, essentially owing to the properties of the plain k-NCN technique. (C) 1998 Published by Elsevier Science B.V. All rights reserved.
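To make the abstract concrete, the following is a minimal sketch of the plain k-NCN rule as commonly described: neighbours are selected greedily so that the centroid of the selected set stays as close as possible to the query sample, and the class is decided by majority vote. The function name and data layout are illustrative assumptions, not the paper's own code.

```python
import numpy as np

def k_ncn_classify(x, X, y, k=3):
    """Classify sample x with the k-Nearest Centroid Neighbours rule.

    Greedy selection (a common formulation, assumed here): the first
    neighbour is the ordinary nearest prototype; each subsequent one
    minimises the distance from x to the centroid of the neighbours
    chosen so far plus the candidate. Majority vote decides the class.
    """
    x = np.asarray(x, dtype=float)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    remaining = list(range(len(X)))
    # First neighbour: plain nearest neighbour of x.
    first = min(remaining, key=lambda i: np.linalg.norm(X[i] - x))
    chosen = [first]
    remaining.remove(first)
    # Subsequent neighbours: keep the centroid of the chosen set near x.
    while len(chosen) < k and remaining:
        best = min(
            remaining,
            key=lambda i: np.linalg.norm(X[chosen + [i]].mean(axis=0) - x),
        )
        chosen.append(best)
        remaining.remove(best)
    # Majority vote among the k centroid neighbours.
    labels, counts = np.unique(y[chosen], return_counts=True)
    return labels[np.argmax(counts)]
```

On a toy two-cluster training set, a query near one cluster is assigned that cluster's label; the centroid criterion tends to pick neighbours that surround the query rather than all lying to one side, which is the geometrical-distribution property the abstract refers to.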