Motivated by recent work of Joe (1989, Ann. Inst. Statist. Math., 41,
683-697), we introduce estimators of entropy and describe their properties. We study the effects of tail behaviour, distribution smoothness
and dimensionality on convergence properties. In particular, we argue
that root-n consistency of entropy estimation requires appropriate assumptions about each of these three features. Our estimators are different from Joe's, and may be computed without numerical integration, but
it can be shown that the same interaction of tail behaviour, smoothness and dimensionality also determines the convergence rate of Joe's estimator. We study both histogram and kernel estimators of entropy, and in each case suggest empirical methods for choosing the smoothing parameter.
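The abstract does not reproduce the estimators themselves. As an illustration of the general idea of kernel entropy estimation without numerical integration, the following is a minimal sketch of a standard leave-one-out (resubstitution-style) kernel estimator for one-dimensional data; it is a generic construction and not necessarily the authors' exact form, and the Gaussian kernel and Silverman-type bandwidth used here are assumptions for the example.

```python
import numpy as np

def kernel_entropy_estimate(x, bandwidth):
    """Leave-one-out Gaussian-kernel entropy estimate for 1-d data.

    Computes H_hat = -(1/n) * sum_i log f_hat_{-i}(x_i), where
    f_hat_{-i} is the kernel density estimate omitting x_i.
    Because the density is evaluated only at the sample points,
    no numerical integration is required.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Pairwise scaled differences and Gaussian kernel weights.
    diffs = (x[:, None] - x[None, :]) / bandwidth
    k = np.exp(-0.5 * diffs ** 2) / (bandwidth * np.sqrt(2.0 * np.pi))
    # Zero the diagonal so each density estimate omits its own point.
    np.fill_diagonal(k, 0.0)
    f_loo = k.sum(axis=1) / (n - 1)
    return -np.mean(np.log(f_loo))
```

For a standard normal sample the estimate should be close to the true entropy 0.5*log(2*pi*e) ≈ 1.419, with a bandwidth such as the Silverman-type choice 1.06 * std(x) * n**(-1/5); the abstract's point is that the smoothing parameter and the tail/smoothness/dimension interplay govern how fast such estimates converge.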