This paper is concerned with the order reduction of a full-order Kalman filter for a stable linear signal model, such that the steady-state filtering error variance associated with the reduced-order filter is minimized. Via an orthogonal parameterization, this problem is formulated as the minimization of the filtering error variance over a set of orthogonal matrices. Both continuous and iterative algorithms are derived for computing an optimal reduced-order filter, and both are shown to possess several useful properties, including convergence. The proposed algorithms are simple to implement and effective in practice. Numerical examples demonstrate their effectiveness and their significant advantages over existing open-loop methods such as the well-known balanced truncation method.
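As a concrete illustration of the problem setup only (not of the paper's optimization algorithms), the following Python sketch evaluates the steady-state error variance of a reduced-order filter obtained by projecting the full-order Kalman filter onto the subspace spanned by an orthonormal matrix V with V'V = I. All model data (A, B, C, R), the dimensions, and the particular choice of V are illustrative assumptions; the paper's algorithms would instead search over the set of such orthogonal matrices to minimize this variance.

```python
# Hypothetical sketch of the reduced-order filtering setup; the model
# matrices and the choice of V below are illustrative assumptions, not
# the paper's algorithm.
import numpy as np
from scipy.linalg import solve_continuous_are, solve_continuous_lyapunov

rng = np.random.default_rng(0)
n, m, r = 6, 2, 3   # full order, number of outputs, reduced order

# Stable signal model  dx = A x dt + B dw,  dy = C x dt + dv
M = rng.standard_normal((n, n))
A = M - (np.linalg.norm(M, 2) + 1.0) * np.eye(n)  # shifted to be Hurwitz
B = rng.standard_normal((n, n))
C = rng.standard_normal((m, n))
Q = B @ B.T         # process-noise intensity (full rank here)
R = np.eye(m)       # measurement-noise intensity

# Full-order Kalman filter: stabilizing solution of the filtering ARE
#   A P + P A' - P C' R^{-1} C P + Q = 0   (dual of the control ARE)
P = solve_continuous_are(A.T, C.T, Q, R)
K = P @ C.T @ np.linalg.inv(R)   # full-order Kalman gain

def error_variance(V):
    """Steady-state variance E||x - V z||^2 for the projected filter
    dz = V'(A - K C)V z dt + V'K dy, assuming the reduced loop is stable."""
    Ar, Kr = V.T @ (A - K @ C) @ V, V.T @ K
    # Joint dynamics of (x, z) driven by the independent noises (dw, dv)
    F = np.block([[A, np.zeros((n, r))], [Kr @ C, Ar]])
    G = np.block([[B, np.zeros((n, m))], [np.zeros((r, n)), Kr]])
    W = np.block([[np.eye(n), np.zeros((n, m))], [np.zeros((m, n)), R]])
    Sigma = solve_continuous_lyapunov(F, -G @ W @ G.T)
    T = np.hstack([np.eye(n), -V])   # estimation error e = x - V z
    return np.trace(T @ Sigma @ T.T)

# One admissible point: V spans the dominant eigenspace of P; any V
# with orthonormal columns is admissible in the formulation above.
w, U = np.linalg.eigh(P)
V0 = U[:, np.argsort(w)[::-1][:r]]
print("steady-state error variance at V0:", error_variance(V0))
```

The projection used here is only one heuristic starting point; the point of the orthogonal parameterization is that the variance returned by such an evaluation can be minimized directly over the orthonormal V.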