This paper presents a new method, based on adaptive filtering theory, for superresolution restoration of continuous image sequences. The proposed methodology uses least squares (LS) estimators that adapt in time, based on adaptive filters: least mean squares (LMS) or recursive least squares (RLS). The adaptation enables the treatment of linear space- and time-variant blurring and of arbitrary motion, both assumed known. The proposed approach is shown to have relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.
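To illustrate the core building block the abstract refers to, the following is a minimal sketch of an LMS adaptive filter, not the paper's full superresolution restoration algorithm. The setup is hypothetical: a known 3-tap blur (standing in for the assumed-known blurring) is identified from its input/output samples via the standard LMS update w ← w + μ·e·x.

```python
# Minimal LMS adaptive filter sketch (illustrative only; the paper's method
# extends this idea to time-adaptive superresolution restoration).
import random

random.seed(0)
true_w = [0.5, 0.3, 0.2]   # hypothetical space-invariant blur taps (assumed known setting)
mu = 0.05                   # LMS step size
w = [0.0, 0.0, 0.0]         # adaptive weight estimate, initialized to zero

x_hist = [0.0, 0.0, 0.0]    # most recent 3 input samples
for n in range(5000):
    x = random.gauss(0.0, 1.0)                       # new input sample
    x_hist = [x] + x_hist[:2]                        # shift the tap-delay line
    d = sum(a * b for a, b in zip(true_w, x_hist))   # desired (blurred) output
    y = sum(a * b for a, b in zip(w, x_hist))        # adaptive filter output
    e = d - y                                        # estimation error
    w = [wi + mu * e * xi for wi, xi in zip(w, x_hist)]  # LMS weight update

print([round(wi, 2) for wi in w])  # weights converge toward true_w
```

The LMS recursion costs only a few multiplies per sample per tap, which is the source of the low computational requirements the abstract claims; RLS trades higher per-sample cost for faster convergence.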