The fixed-slope lossy algorithm derived from the kth-order adaptive arithmetic codeword length function is extended to the case of finite-state decoders or trellis-structured decoders. It is shown that when this algorithm is used to encode a stationary, ergodic source with a continuous alphabet, the Lagrangian performance (i.e., the resulting compression rate plus lambda times the resulting distortion) converges with probability one to a quantity computable as the infimum of an information-theoretic functional over a set of auxiliary random variables and reproduction levels, where lambda > 0 and -lambda is designated to be the slope of the rate-distortion function R(D) of the source at some D; this quantity is close to R(D) + lambda D when the order k used in the arithmetic coding or the number of states in the decoders is large enough. An alternating minimization algorithm for computing the quantity is presented; this algorithm is based on a training sequence and in turn gives rise to a design algorithm for variable-rate trellis source codes. The resulting variable-rate trellis source codes are very efficient in low-rate regions (below 0.8 bits/sample). With k = 8, the mean-squared error encoding performance at the rate of 1/2 bit/sample for memoryless Gaussian sources is comparable to that afforded by trellis-coded quantizers; with k = 8 and 32 states in the decoder, the mean-squared error encoding performance at the rate of 1/2 bit/sample for memoryless Laplacian sources is about 1 dB better than that afforded by trellis-coded quantizers with 256 states. With k = 8 and 256 states in the decoder, the mean-squared error encoding performance at rates of a fraction of 1 bit/sample for highly dependent Gauss-Markov sources with correlation coefficient 0.9 is within about 0.6 dB of the distortion-rate function. Note that at such low rates, predictive coders usually perform poorly.