Many electric utilities use small reservoirs in mountainous regions to generate hydropower to meet peak energy demands. Water input depends on the water budget of the catchment, whereas output depends on user demand, which is influenced by temperature. Hence reservoir performance depends on climatic factors and is sensitive to climate change. A model, based on the systems of Duke Power and Virginia Power in the south-eastern USA, was developed to simulate performance. The annual maximum draw-down of the reservoir, which represents the minimum dam size needed to maintain continuous energy generation, is considered here. The model was tested for four regions in the eastern USA using 1951-1995 observations. The amount of draw-down depended on the linked daily sequences of precipitation and temperature, the former dictating the water available, the latter influencing both evaporation and energy demand. The time and level of the annual extreme emphasized that small changes in the timing of a dry spell had a major impact on the draw-down.
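To make this coupling concrete, the following is a minimal sketch of a daily reservoir mass balance in which precipitation drives inflow while temperature drives both evaporation and demand-led releases. It is an illustrative assumption, not the paper's actual model: the function annual_max_drawdown and every coefficient (runoff_coef, evap_coef, base_release, demand_sens, comfort_c) are hypothetical.

```python
def annual_max_drawdown(precip_mm, temp_c, capacity=1.0, runoff_coef=0.5,
                        evap_coef=0.02, base_release=0.003,
                        demand_sens=0.0001, comfort_c=18.0):
    """Annual maximum draw-down (fraction of a full reservoir) from a
    schematic daily mass balance. All names and coefficients here are
    illustrative assumptions, not the published model."""
    storage = capacity
    max_dd = 0.0
    for p, t in zip(precip_mm, temp_c):
        inflow = runoff_coef * p / 1000.0        # precipitation dictates water available
        evap = evap_coef * max(t, 0.0) / 1000.0  # evaporation rises with temperature
        # Energy demand, and hence release, grows with deviation from a comfort temperature
        release = base_release + demand_sens * abs(t - comfort_c)
        storage = max(0.0, min(capacity, storage + inflow - evap - release))
        max_dd = max(max_dd, capacity - storage)
    return max_dd

# Example: an otherwise wet, cool year containing a 30-day warm, dry spell;
# shifting that spell shifts the timing and depth of the annual extreme.
precip = [10.0] * 150 + [0.0] * 30 + [10.0] * 185
temp = [10.0] * 150 + [28.0] * 30 + [10.0] * 185
print(annual_max_drawdown(precip, temp))
```

Because releases and evaporation both grow with temperature while inflow tracks precipitation, a dry spell that coincides with a warm spell deepens the draw-down, consistent with the timing sensitivity noted above.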
Climatic changes were simulated by uniformly increasing temperatures by 2°C and decreasing precipitation by 10 per cent. The resultant draw-down increased from current simulated values by about 10-15 per cent, with extremes up to 50 per cent.
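Under the same illustrative assumptions, this uniform scenario amounts to a simple perturbation of the input series. Only the +2°C and -10 per cent values come from the text; the function scenario_change and the reuse of the sketch above are hypothetical.

```python
def scenario_change(precip_mm, temp_c, d_temp=2.0, d_precip=-0.10):
    """Apply the uniform scenario from the text (+2 degrees C, -10%
    precipitation) to the illustrative annual_max_drawdown model above."""
    base = annual_max_drawdown(precip_mm, temp_c)
    pert = annual_max_drawdown([p * (1.0 + d_precip) for p in precip_mm],
                               [t + d_temp for t in temp_c])
    return base, pert

# Compare baseline and perturbed draw-down for the example year above.
base, pert = scenario_change(precip, temp)
print(base, pert, (pert - base) / base)
```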
This change was of the same order as, but in the opposite direction to, that created by a 10 per cent increase in the efficiency of energy generation. Without such an efficiency increase, many utilities will face the prospect of reduced or less reliable hydroelectric generation if the climate changes in the manner examined here. © 1997 by the Royal Meteorological Society.