Dynamic power management schemes (also called policies) reduce the power consumption of complex electronic systems by trading off performance for power in a controlled fashion, taking system workload into account. In a power-managed system it is possible to set components into different states, each characterized by performance and power consumption levels. The main function of a power management policy is to decide when to perform component state transitions and which transition should be performed, depending on system history, workload, and performance constraints.
In the past, power management policies have been formulated heuristically. The main contribution of this paper is to introduce a finite-state, abstract system model for power-managed systems based on Markov decision processes. Under this model, the problem of finding policies that optimally trade off performance for power can be cast as a stochastic optimization problem and solved exactly and efficiently. The applicability and generality of the approach are assessed by formulating Markov models and optimizing power management policies for several systems.
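To make the idea concrete, the sketch below solves a hypothetical two-state power-managed device as a Markov decision process by value iteration. The states ("busy"/"idle"), actions ("run"/"sleep"), transition probabilities, and cost numbers are all illustrative assumptions, not taken from the paper; the cost of each state-action pair blends power consumption with a performance (latency) penalty, and the optimal policy minimizes expected discounted cost.

```python
# Toy MDP for a power-managed device; all model parameters are
# illustrative assumptions, not values from the paper.

states = ["busy", "idle"]
actions = ["run", "sleep"]

# P[s][a][t]: probability of moving to state t from state s under action a
# (assumed workload model; sleeping under load tends to keep work backed up).
P = {
    "busy": {"run":   {"busy": 0.6, "idle": 0.4},
             "sleep": {"busy": 0.9, "idle": 0.1}},
    "idle": {"run":   {"busy": 0.3, "idle": 0.7},
             "sleep": {"busy": 0.3, "idle": 0.7}},
}

# cost[s][a]: power cost plus a latency penalty, in illustrative units.
cost = {
    "busy": {"run": 1.0, "sleep": 2.5},  # sleeping under load: large penalty
    "idle": {"run": 1.0, "sleep": 0.1},  # sleeping when idle: nearly free
}

def value_iteration(gamma=0.95, eps=1e-8):
    """Solve the discounted-cost MDP to tolerance by value iteration."""
    V = {s: 0.0 for s in states}
    while True:
        V_new = {s: min(cost[s][a] +
                        gamma * sum(P[s][a][t] * V[t] for t in states)
                        for a in actions)
                 for s in states}
        if max(abs(V_new[s] - V[s]) for s in states) < eps:
            break
        V = V_new
    # Greedy policy with respect to the converged value function.
    policy = {s: min(actions,
                     key=lambda a: cost[s][a] +
                     gamma * sum(P[s][a][t] * V[t] for t in states))
              for s in states}
    return policy, V

policy, V = value_iteration()
print(policy)  # → {'busy': 'run', 'idle': 'sleep'}
```

With these assumed numbers the optimal policy is the intuitive one: keep the component running while there is work and put it to sleep when idle. The paper's contribution is that such policies can be computed exactly for the real system model rather than guessed heuristically.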