Following Shannon, we introduce higher order entropies and derive dynamic entropies. The nth order dynamic entropy (conditional entropy) is a measure of the uncertainty of the next state following the observation of n foregoing states.
amic entropies at large n is studied for several nonlinear model syste
ms and for symbolic sequences with long-range order (LRO). For example
we investigate 1D-maps, texts, DNA-strings and time series. It is sho
wn that the existence of long correlations improves the possibility of
predictions. Characteristic scaling laws for the higher order Shannon
entropies and the conditional entropies are derived and a new interpo
lation formula is tested. Finally instead of the dynamic entropies whi
ch yield mean values of the uncertainty/predictability we investigate
the local values of the uncertainty/predictability.
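To illustrate how such dynamic entropies can be estimated from a finite symbolic sequence, a minimal sketch under the definitions above (not the authors' code; function names, the chosen map and the partition are our illustrative assumptions) counts n-gram frequencies and forms h_n = H_{n+1} - H_n:

```python
from collections import Counter
from math import log2

def block_entropy(seq, n):
    """Shannon block entropy H_n, estimated from relative n-gram frequencies."""
    counts = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def dynamic_entropy(seq, n):
    """Conditional (dynamic) entropy h_n = H_{n+1} - H_n:
    mean uncertainty of the next symbol after observing n preceding symbols."""
    return block_entropy(seq, n + 1) - block_entropy(seq, n)

# Illustrative symbol source: the logistic map x -> 4x(1-x), a simple 1D map,
# coded into binary symbols by partitioning at x = 0.5.
x, seq = 0.3, []
for _ in range(100000):
    x = 4.0 * x * (1.0 - x)
    seq.append(1 if x > 0.5 else 0)

for n in range(1, 6):
    print(n, dynamic_entropy(seq, n))
```

For sequences with long-range order the estimated h_n decreases with growing n, reflecting the improved predictability mentioned above; note that for large n the finite-sample estimates become unreliable unless the sequence is long.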