The tail index of a density has been widely used as an indicator of the probability of getting a large deviation in a random variable. Most of the theory underlying popular estimators of it assumes that the data are independently and identically distributed (i.i.d.). However, many recent applications of the estimator have been to financial data, and such data tend to exhibit long-range dependence. We show, via Monte Carlo simulations, that conventional measures of the precision of the estimator, which are based on the i.i.d. assumption, greatly overstate that precision when such dependent data are used. This conclusion also has implications for estimates of the likelihood of getting some extreme values, and we illustrate, using equity return data, how the conclusions one would draw change.
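
The abstract does not name a particular estimator; as a minimal sketch, assuming the Hill estimator (the most common tail-index estimator in this literature), the computation and the i.i.d.-based standard error it discusses look roughly as follows. The function name, the choice of k, and the simulated Pareto sample are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the tail index alpha from the k largest observations.

    gamma_hat is the mean log excess of the k largest order statistics
    over the (k+1)-th largest; alpha_hat = 1 / gamma_hat.
    """
    x = np.sort(np.asarray(x, dtype=float))[::-1]  # descending order statistics
    if not 0 < k < len(x):
        raise ValueError("k must satisfy 0 < k < len(x)")
    gamma_hat = np.mean(np.log(x[:k] / x[k]))
    return 1.0 / gamma_hat

rng = np.random.default_rng(0)
alpha_true = 3.0
# Pareto-tailed i.i.d. sample with tail index 3 (illustrative, not the paper's data)
sample = rng.pareto(alpha_true, size=5000) + 1.0
k = 250
alpha_hat = hill_estimator(sample, k)

# Conventional i.i.d.-based standard error, alpha_hat / sqrt(k). The paper's
# point is that under long-range dependence this understates the true sampling
# variability, so confidence intervals built from it are too narrow.
se_iid = alpha_hat / np.sqrt(k)
```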