The level density parameter for finite nuclei is studied in the framework of relativistic mean field theory. Systematic self-consistent calculations are performed in the Thomas-Fermi approximation using sigma-omega models that include scalar meson self-couplings. For realistic nuclear matter properties, the level density parameter turns out to lie in the range of values obtained in non-relativistic calculations with Skyrme interactions, and is thus smaller than the global trend of the experimental data. The implications for the level density parameter of including vacuum fluctuations and exchange corrections in the mean field theory are also investigated.
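For context, the level density parameter $a$ enters through the standard Fermi-gas relations (textbook background, not a result of this work):
\begin{equation}
E^{*} = a\,T^{2}, \qquad
\rho(E^{*}) \propto \exp\!\bigl(2\sqrt{a E^{*}}\bigr), \qquad
a = \frac{\pi^{2}}{6}\, g(\varepsilon_{F}),
\end{equation}
where $E^{*}$ is the excitation energy, $T$ the nuclear temperature, $\rho(E^{*})$ the level density, and $g(\varepsilon_{F})$ the single-particle level density at the Fermi energy.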