Reducing energy losses in power distribution systems is important. Beyond the energy savings themselves, the economic benefits of loss minimization include released generation, transmission, and substation capacity, as well as the deferral or elimination of system expansion. Reducing losses also reduces feeder voltage drop and consequently improves voltage regulation. One method of minimizing losses is to use existing switches to reconfigure the distribution network into a minimum-loss topology. However, this is a nonlinear programming problem that imposes a heavy computational burden even for a moderately sized distribution system. This paper investigates the application of evolution strategies to distribution feeder reconfiguration for loss minimization. The principles of evolution strategies are discussed, and a model is proposed. The effectiveness of the method is demonstrated through computer simulations.
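To illustrate the flavor of the approach, the following is a minimal sketch of a (1+1) evolution strategy searching over binary switch states. The `toy_loss` function is a hypothetical stand-in for a power-flow loss calculation, and the sketch ignores real-world constraints such as maintaining a radial topology; it is illustrative only, not the paper's model.

```python
import random

def toy_loss(config):
    # Hypothetical stand-in for a power-flow loss calculation:
    # each closed switch contributes a fixed weight to total loss,
    # so the optimum is simply to open every switch. A real model
    # would evaluate losses via load flow under feasibility constraints.
    weights = [5, 1, 4, 2, 3, 6, 2, 1]
    return sum(w for w, s in zip(weights, config) if s)

def evolve(n_switches=8, generations=200, seed=0):
    """(1+1) evolution strategy over binary switch configurations."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n_switches)]
    best = toy_loss(parent)
    for _ in range(generations):
        # Mutate: flip each switch state with probability 1/n.
        child = [s ^ (rng.random() < 1.0 / n_switches) for s in parent]
        loss = toy_loss(child)
        if loss <= best:  # accept offspring that is no worse
            parent, best = child, loss
    return parent, best
```

In practice, population-based (μ, λ) or (μ+λ) strategies with self-adaptive mutation rates are common; the selection and mutation steps above show the core accept-if-not-worse loop in its simplest form.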