We present a new approach to the study of relaxation towards equilibrium in lattice model systems with a non-conserved order parameter that remain spatially homogeneous in time. The theory rests mainly on a simple scaling relation for the relaxation time. When this relation holds, the time-dependent probability distribution of the system takes a Gibbsian form near the final equilibrium state. We apply the theory to the q-state Potts model in one dimension. In addition, we present a Monte Carlo computer-simulation study that provides direct and strong support for the hypotheses and results of the theory.
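The kind of dynamics the abstract refers to, single-site updates of a 1D q-state Potts chain with a non-conserved order parameter, can be sketched as follows. This is a minimal illustration, not the authors' actual simulation: the heat-bath update rule, the coupling J, the chain length, and the temperature are all assumptions chosen for demonstration.

```python
import math
import random

def potts_energy(spins, J=1.0):
    # 1D Potts energy with periodic boundaries:
    # E = -J * sum_i delta(s_i, s_{i+1})
    N = len(spins)
    return -J * sum(spins[i] == spins[(i + 1) % N] for i in range(N))

def heat_bath_sweep(spins, q, beta, J=1.0, rng=random):
    # One Monte Carlo sweep of single-site heat-bath updates:
    # the new state of site i is drawn from the local Boltzmann
    # distribution, so the order parameter is not conserved.
    N = len(spins)
    for _ in range(N):
        i = rng.randrange(N)
        left, right = spins[(i - 1) % N], spins[(i + 1) % N]
        # Boltzmann weight of each of the q possible states at site i
        weights = [math.exp(beta * J * ((s == left) + (s == right)))
                   for s in range(q)]
        r = rng.random() * sum(weights)
        acc = 0.0
        for s, w in enumerate(weights):
            acc += w
            if r <= acc:
                spins[i] = s
                break
    return spins

# Relax a disordered chain toward equilibrium, tracking energy per site.
q, N, beta = 3, 200, 1.5          # assumed demo parameters
rng = random.Random(0)
spins = [rng.randrange(q) for _ in range(N)]
energies = []
for sweep in range(200):
    heat_bath_sweep(spins, q, beta, rng=rng)
    energies.append(potts_energy(spins) / N)
print(energies[0], energies[-1])  # energy per site relaxes downward
```

Because each update draws the site's new state from its conditional equilibrium distribution, the chain relaxes toward the Gibbs state at inverse temperature beta, which is the regime where the scaling relation discussed in the paper is meant to apply.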