Denitrification (DN) plays an important role in models simulating soil nitrate leaching, because denitrification losses influence the soil N budget and hence the amount of nitrate available for leaching. Despite the importance of DN estimates, the behavior of submodels simulating DN losses is not well known. The aim of this study was to improve understanding of the behavior of submodels simulating DN losses in selected crop growth/N-leaching models. To remove the interference of other processes on the results, we isolated the algorithms predicting the denitrification rate (DNR) and applied them to three data sets selected from the available literature. Denitrification subroutines were taken from the models EPIC, CropSyst, CREAMS, GLEAMS, CERES-N, and NLEAP. A preliminary sensitivity analysis showed how quite similar algorithms can give rise to considerably different predictions. The comparison of measured vs. predicted values confirmed that DNR is strongly affected by the relationship between soil water content (SW) and the soil water values at field capacity and/or at saturation. The CERES-N, CREAMS, GLEAMS, and EPIC submodels underestimated, by up to 100%, the number of occasions on which DNR > 0, because they simulated the DN process only when SW exceeded SW at field capacity, whereas in real systems DN occurs even at lower SW values. The results suggest that DN modeling should take greater account of the contribution to DN losses under aerobic conditions. The large natural variation of DNR measurements invalidates the use of statistical criteria for comparing measured vs. estimated DN values in deterministic models.
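The field-capacity threshold behavior discussed above can be sketched in a few lines. The two response functions and all parameter values below are hypothetical illustrations of the general modeling approaches, not the actual equations used in any of the named submodels:

```python
# Illustrative sketch: two hypothetical soil-water response functions for the
# denitrification rate (DNR). A hard field-capacity threshold predicts DNR = 0
# whenever soil water (SW) is at or below field capacity, while a smoother
# power-law response allows some DN under drier (aerobic) conditions.

def dnr_threshold(sw, fc, sat, dnr_max=1.0):
    """Threshold-style response: DN occurs only when SW exceeds field capacity (fc)."""
    if sw <= fc:
        return 0.0
    # Linear ramp from field capacity up to saturation (sat).
    return dnr_max * (sw - fc) / (sat - fc)

def dnr_smooth(sw, sat, dnr_max=1.0, exponent=4.0):
    """Smooth response: small but nonzero DN even below field capacity."""
    return dnr_max * (sw / sat) ** exponent

# Example volumetric water contents (cm3 cm-3), chosen for illustration only.
fc, sat = 0.30, 0.45
for sw in (0.20, 0.30, 0.35, 0.45):
    print(sw, round(dnr_threshold(sw, fc, sat), 3), round(dnr_smooth(sw, sat), 3))
```

At SW = 0.20 or 0.30 the threshold function returns zero, which is the behavior that causes the underestimation reported above, whereas the smooth function still predicts a small positive DNR under those drier conditions.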