Standard implementations of non-parametric classifiers have large computational requirements. Parzen classifiers use the distances of an unknown vector to all N prototype samples, and consequently exhibit O(N) behavior in both memory and time. We describe four techniques for expediting nearest-neighbor methods: replacing the linear search with a new kd tree method, exhibiting approximately O(N^(1/2)) behavior; employing an L-infinity instead of an L-2 distance metric; using variance-ordered features; and rejecting prototypes by evaluating distances in low-dimensionality subspaces. We demonstrate that variance-ordered features yield significant efficiency gains over the same features linearly transformed to have uniform variance. We give results for a large OCR problem, but note that the techniques expedite recognition for arbitrary applications. Three of the four techniques preserve recognition accuracy. (C) 1997 Pattern Recognition Society.
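Three of the ideas in the abstract can be combined in a single search loop: examining features in order of decreasing variance, using the L-infinity (Chebyshev) metric, and rejecting a prototype as soon as its partial distance in a low-dimensionality subspace already exceeds the best distance found so far. The following is a minimal illustrative sketch, not the paper's implementation; the function name and data layout are assumptions.

```python
import numpy as np

def nn_partial_distance(prototypes, query):
    """Illustrative nearest-neighbor search (not the paper's code):
    variance-ordered features, L-infinity metric, and early rejection
    of prototypes via partial distances in low-dimensional subspaces."""
    # Examine features in decreasing-variance order, so large contributions
    # to the distance appear early and rejection happens sooner.
    order = np.argsort(prototypes.var(axis=0))[::-1]
    P = prototypes[:, order]
    q = query[order]

    best_dist = np.inf
    best_idx = -1
    for i, p in enumerate(P):
        d = 0.0
        for pj, qj in zip(p, q):
            # L-infinity distance: running maximum of coordinate gaps.
            d = max(d, abs(pj - qj))
            if d >= best_dist:
                # Partial distance already too large: reject this
                # prototype without examining the remaining features.
                break
        else:
            best_dist = d
            best_idx = i
    return best_idx, best_dist
```

The partial-distance test is conservative: because the L-infinity distance can only grow as more features are examined, a rejected prototype can never be the true nearest neighbor, so (unlike an approximate search) accuracy is preserved.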