The present paper investigates the error committed by using an infinite-time-horizon Markov renewal program as an approximation of the (often more realistic) Markov renewal program with a finite time horizon t_0. Under weak assumptions the error is shown to converge to zero exponentially fast as t_0 --> infinity. The convergence is based on explicit error bounds. Improved error bounds hold when the (transformed) transition law has a nontrivial stochastic lower bound. Some bounds use the discounted renewal function. For the latter, monotone upper and lower bounds are obtained by an iterative method combined with an extrapolation. Several examples demonstrate the applicability of the results.
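The monotone bracketing of a discounted renewal function mentioned above can be illustrated by a small numerical sketch. The discounted renewal function H solves the renewal equation H(t) = beta*F(t) + beta*Integral_0^t H(t-s) dF(s); since the associated operator is monotone, iterating from H_0 = 0 gives increasing lower bounds, while iterating from the constant beta/(1-beta) gives decreasing upper bounds. This is an assumed illustration (grid discretization, Exp(1) holding times, the paper's extrapolation step omitted), not the paper's exact scheme.

```python
import numpy as np

def renewal_operator(H, dF, F, beta):
    """One application of (T H)(t_i) = beta*F(t_i) + beta*sum_{j<=i} H(t_i - t_j) dF_j."""
    n = len(F)
    out = np.empty(n)
    for i in range(n):
        conv = np.dot(H[i::-1], dF[:i + 1])  # discrete convolution up to t_i
        out[i] = beta * (F[i] + conv)
    return out

h = 0.05
t = np.arange(0.0, 5.0, h)
F = 1.0 - np.exp(-t)            # Exp(1) holding-time distribution (assumed example)
dF = np.diff(F, prepend=0.0)    # increments of F on the grid
beta = 0.9                      # discount factor

lower = np.zeros_like(t)                    # H_0 = 0: iterates increase toward H
upper = np.full_like(t, beta / (1 - beta))  # crude constant bound: iterates decrease
for _ in range(50):
    lower = renewal_operator(lower, dF, F, beta)
    upper = renewal_operator(upper, dF, F, beta)

# lower <= H <= upper pointwise, and the bracket shrinks geometrically (factor beta)
gap = float(np.max(upper - lower))
```

After 50 iterations the remaining gap is of order beta^50 times the initial gap, so lower and upper pin down H tightly; an extrapolation step, as in the paper, would accelerate this further.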