We introduce a method of characterizing the complexity of the minima of a given function of N variables by means of the entropy, S(N), of a measure selected as follows: one chooses an algorithm that computes the minima and assigns to each minimum a probability equal to the (relative) measure of its basin of attraction. For large N, the behavior of S(N) yields a well-defined entropy density sigma of the minima with respect to the chosen minima-generating law. sigma characterizes the complexity of the minima in a rather natural way in the framework of information theory, i.e., the number of bits needed to transmit one (typical) minimum configuration is approximately N sigma / ln 2.
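The following is a minimal sketch of the procedure described above, not the paper's actual construction: it assumes a simple separable double-well test function, scipy's BFGS minimizer as the "minima-generating" algorithm, and a rounding tolerance for identifying distinct minima; basin measures are estimated empirically as the fraction of random starting points attracted to each minimum.

import numpy as np
from scipy.optimize import minimize

def f(x):
    """Assumed test function of N variables with ~2^N minima (one double well per coordinate)."""
    return np.sum(x**4 - x**2 + 0.1 * x)

def basin_entropy(fun, N, n_starts=2000, tol=1e-2, seed=0):
    """Estimate S(N): minimize from many random starts and weight each distinct
    minimum by the fraction of starts that converge to it (an empirical proxy
    for the relative measure of its basin of attraction)."""
    rng = np.random.default_rng(seed)
    counts = {}
    for _ in range(n_starts):
        x0 = rng.uniform(-2.0, 2.0, size=N)
        res = minimize(fun, x0, method="BFGS")
        # Identify the minimum by its coordinates rounded to the tolerance tol.
        key = tuple(np.round(res.x / tol).astype(int))
        counts[key] = counts.get(key, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n_starts
    return -np.sum(p * np.log(p))  # entropy in nats

N = 6
S = basin_entropy(f, N)
print(f"S(N) ~ {S:.3f} nats, sigma ~ S/N ~ {S / N:.3f}")
print(f"bits per typical minimum ~ N*sigma/ln2 ~ {S / np.log(2):.3f}")

For this separable example one expects sigma close to ln 2, since each coordinate contributes roughly one binary choice of well; the number of starting points and the rounding tolerance are arbitrary choices made for illustration.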