This study examines the statistical likelihood of detecting a trend in annual runoff given an assumed change in mean annual runoff, the underlying year-to-year variability in runoff, and the serial correlation of annual runoff. Means, standard deviations, and lag-1 serial correlations of annual runoff were computed for 585 stream gages in the conterminous United States, and these statistics were used to compute the probability of detecting a prescribed trend in annual runoff. Assuming a linear 20% change in mean annual runoff over a 100 yr period and a significance level of 95%, the average probability of detecting a significant trend was 28% among the 585 stream gages. The largest probabilities of detecting a trend were in the northwestern U.S., the Great Lakes region, the northeastern U.S., the Appalachian Mountains, and parts of the northern Rocky Mountains. The smallest probabilities of trend detection were in the central and southwestern U.S. and in Florida. Low probabilities of trend detection were associated with low ratios of mean annual runoff to the standard deviation of annual runoff and with high lag-1 serial correlation in the data.
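The abstract does not give the authors' exact detection procedure, but the dependence of detection probability on the mean-to-standard-deviation ratio and on lag-1 serial correlation can be illustrated with a simple Monte Carlo sketch. The function below, its name, its parameter values, and the use of ordinary linear regression as the trend test are all illustrative assumptions, not the study's method: it simulates 100 years of annual runoff as AR(1) noise around a mean with a superimposed linear 20% change, tests each realization for a significant trend, and reports the fraction of realizations in which the trend is detected.

```python
# Hypothetical sketch (not the authors' code): Monte Carlo estimate of the
# probability of detecting a linear trend in annual runoff, assuming an
# AR(1) model for year-to-year variability. Parameter values are illustrative.
import numpy as np
from scipy import stats

def detection_probability(mean_runoff, sd_runoff, lag1, n_years=100,
                          pct_change=0.20, alpha=0.05, n_sims=2000, seed=0):
    """Fraction of simulated records in which a linear trend is significant."""
    rng = np.random.default_rng(seed)
    trend = np.linspace(0.0, pct_change * mean_runoff, n_years)  # linear 20% change
    sd_innov = sd_runoff * np.sqrt(1.0 - lag1 ** 2)              # AR(1) innovation SD
    years = np.arange(n_years)
    detected = 0
    for _ in range(n_sims):
        # Generate AR(1) noise with marginal standard deviation sd_runoff.
        noise = np.empty(n_years)
        noise[0] = rng.normal(0.0, sd_runoff)
        for t in range(1, n_years):
            noise[t] = lag1 * noise[t - 1] + rng.normal(0.0, sd_innov)
        series = mean_runoff + trend + noise
        # Test the slope of a least-squares fit against zero.
        result = stats.linregress(years, series)
        if result.pvalue < alpha:
            detected += 1
    return detected / n_sims

# Example: a low mean-to-SD ratio and a high lag-1 correlation both lower
# the detection probability, consistent with the pattern described above.
print(detection_probability(mean_runoff=300.0, sd_runoff=150.0, lag1=0.3))
```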