This article shows how to smoothly "monotonize" standard kernel estimators of hazard rate, using bootstrap weights. Our method takes a variety of forms, depending on choice of kernel estimator and on the distance function used to define a certain constrained optimization problem. We confine attention to a particularly simple kernel approach and explore a range of distance functions. It is straightforward to reduce "quadratic" inequality constraints to "linear" equality constraints, and so our method may be implemented using little more than conventional Newton-Raphson iteration. Thus, the necessary computational techniques are very familiar to statisticians. We show both numerically and theoretically that monotonicity, in either direction, can generally be imposed on a kernel hazard rate estimator, regardless of the monotonicity or otherwise of the true hazard rate. The case of censored data is easily accommodated. Our methods have straightforward extension to the problem of testing for monotonicity of hazard rate, where the distance function plays the role of a test statistic.
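The abstract does not spell out the bootstrap-weight optimization itself, so the sketch below is only an illustrative stand-in: it forms a naive kernel hazard estimate and then monotonizes the fitted curve directly with the pool-adjacent-violators algorithm (an L2 projection onto nondecreasing sequences), rather than by reweighting the data as the article's method does. The function names `kernel_hazard` and `pava_increasing`, the Gaussian kernel, and the bandwidth choice are all assumptions for this demonstration.

```python
import numpy as np

def kernel_hazard(t, obs, b):
    """Naive kernel hazard estimate on grid t from uncensored data obs.

    h_hat(t) = (1/n) * sum_i K_b(t - X_i) / S_hat(t), with a Gaussian
    kernel K_b and the empirical survival function S_hat (floored at 1/n
    to avoid division by zero in the right tail).
    """
    n = len(obs)
    k = np.exp(-0.5 * ((t[:, None] - obs[None, :]) / b) ** 2)
    k /= b * np.sqrt(2.0 * np.pi)
    surv = (obs[None, :] > t[:, None]).mean(axis=1)
    return k.sum(axis=1) / (n * np.maximum(surv, 1.0 / n))

def pava_increasing(y):
    """Pool Adjacent Violators: nearest nondecreasing sequence in L2."""
    levels = []  # blocks of (value, weight), weights count pooled points
    for v in np.asarray(y, dtype=float):
        levels.append([v, 1.0])
        # Merge backwards while the monotonicity constraint is violated.
        while len(levels) > 1 and levels[-2][0] > levels[-1][0]:
            v2, w2 = levels.pop()
            v1, w1 = levels.pop()
            levels.append([(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2])
    out = []
    for v, w in levels:
        out.extend([v] * int(w))
    return np.array(out)
```

Because PAVA replaces violating runs by their average, the monotonized curve preserves the mean of the raw estimate over the grid; a quadratic-distance version of the weight-based approach would instead perturb the observation weights as little as possible subject to the same monotonicity constraint.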