Traditionally, computational complexity has considered only static problems. Classical complexity classes such as NC, P, and NP are defined in terms of the complexity of checking, upon presentation of an entire input, whether the input satisfies a certain property. For many applications of computers it is more appropriate to model the process as a dynamic one: a fairly large object is worked on over a period of time, repeatedly modified by users, and computations are performed on it. We develop a theory of dynamic complexity. We study the new complexity class dynamic first-order logic (Dyn-FO). This is the set of properties that can be maintained and queried in first-order logic, i.e., relational calculus, on a relational database. We show that many interesting properties are in Dyn-FO, including multiplication, graph connectivity, bipartiteness, and the computation of minimum spanning trees. Note that none of these problems is in static FO, and this fact has been used to justify increasing the power of query languages beyond first-order. It is thus striking that these problems are indeed dynamic first-order and, thus, were computable in first-order database languages all along. We also define "bounded-expansion reductions," which honor dynamic complexity classes. We prove that certain standard complete problems for static complexity classes, such as REACH(a) for P, remain complete via these new reductions. On the other hand, we prove that other such problems, including REACH for NL and REACH(d) for L, are no longer complete via bounded-expansion reductions. Furthermore, we show that a version of REACH, called REACH(a)(+), is not in Dyn-FO unless all of P is contained in parallel linear time. © 1997 Academic Press.
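To make the Dyn-FO idea concrete, here is a minimal sketch (not taken from the paper) of maintaining undirected graph connectivity under edge insertions only. The auxiliary relation `R` ("x and y lie in the same connected component") is updated after each insertion by evaluating a fixed first-order formula over the old relation, so each query is a single lookup rather than a recomputation from scratch. All function names are illustrative; handling deletions, as Dyn-FO requires, needs richer auxiliary relations than shown here.

```python
def make_state(vertices):
    # Initially every vertex is connected only to itself (reflexive pairs).
    return {(x, x) for x in vertices}

def insert_edge(R, u, v):
    # First-order update formula, evaluated over the old relation R:
    #   R'(x, y) <- R(x, y) OR (R(x, u) AND R(v, y)) OR (R(x, v) AND R(u, y))
    xs = {x for (x, _) in R}
    return R | {(x, y) for x in xs for y in xs
                if ((x, u) in R and (v, y) in R)
                or ((x, v) in R and (u, y) in R)}

def connected(R, x, y):
    # The query itself is trivially first-order: one lookup.
    return (x, y) in R

R = make_state(range(5))
R = insert_edge(R, 0, 1)
R = insert_edge(R, 1, 2)
print(connected(R, 0, 2))  # True: 0-1-2 are now one component
print(connected(R, 0, 3))  # False: 3 is still isolated
```

The point of the sketch is that the per-update work is a fixed relational-calculus expression, which is exactly what membership in Dyn-FO demands, even though connectivity is not expressible in static FO.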