Accurate detection of both maximum and minimum circuit delay times, and the detection of input vectors that produce those delays, are crucial tasks in the design and testing of high-speed CMOS circuits. This is especially true for timing disciplines, such as single-phase latching, wave pipelining, and asynchronous design, where combinational logic path delays are designed to be nearly equal and timing constraints are very tight. For these design methodologies, traditional timing analysis based on gate delay models that assume a single delay value per gate, or delay values based only on gate inputs, is not sufficient. For example, the delay of a two-input CMOS NAND gate can vary by as much as a factor of two depending on whether one input or both inputs are changing.
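To make the distinction concrete, a minimal sketch of a table-driven delay model is shown below. It is not the paper's model; the function name and the delay values are illustrative assumptions only, chosen to reflect the roughly factor-of-two spread between single-input and simultaneous-input switching.

```python
# Sketch of a transition-dependent delay model for a 2-input CMOS NAND.
# All numbers are illustrative placeholders, not measured values.
# Delay is keyed by (number of inputs switching, output transition).
NAND2_DELAY = {
    (1, "rise"): 0.20,  # one input falls; output rises through one PMOS
    (1, "fall"): 0.18,  # one input rises; one NMOS in the stack switches
    (2, "rise"): 0.10,  # both inputs fall; parallel PMOS pull-up is faster
    (2, "fall"): 0.35,  # both inputs rise; series NMOS pull-down is slower
}

def nand2_delay(inputs_switching: int, output_transition: str) -> float:
    """Look up gate delay for the given switching scenario (hypothetical)."""
    return NAND2_DELAY[(inputs_switching, output_transition)]
```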
This implies that, for accurate detection of maximum and minimum overall delay, sensitization of multiple paths must be considered in order to ascertain the feasibility of multiple simultaneous input transitions at particular gates.
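One way to phrase a necessary condition for such simultaneity, sketched here with hypothetical names rather than the paper's formulation, is an overlap test on input arrival-time windows; actual feasibility additionally requires that some single input vector sensitizes both paths at once.

```python
def can_switch_together(earliest_a: float, latest_a: float,
                        earliest_b: float, latest_b: float) -> bool:
    """True if the arrival windows of inputs A and B intersect, i.e. the
    two transitions could occur at the same time (necessary, not
    sufficient, for a feasible simultaneous transition)."""
    return max(earliest_a, earliest_b) <= min(latest_a, latest_b)
```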
This paper considers this problem, beginning with a brief discussion of CMOS gate delays. An algorithm is then presented that accurately detects both maximum and minimum delays, taking into account delay differences due to rising and falling transitions and due to single vs. multiple simultaneous signal changes at gate inputs.
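The flavor of such an analysis can be suggested by the sketch below, which propagates earliest/latest arrival times through a topologically ordered network of NAND gates, reusing the hypothetical nand2_delay and can_switch_together sketches above. It is only an assumed illustration of the general idea, not the algorithm presented in the paper: a faithful analyzer must also verify, per gate, that the simultaneous-switching case is sensitizable by a real input vector.

```python
def propagate(gates, arrival):
    """gates: list of (output_net, (input_a, input_b)) in topological order.
    arrival: dict mapping net -> (earliest, latest) arrival time."""
    for out, (in_a, in_b) in gates:
        ea, la = arrival[in_a]
        eb, lb = arrival[in_b]
        candidates = []
        for transition in ("rise", "fall"):
            # Single-input switching: each input considered alone.
            for t in (ea, la, eb, lb):
                candidates.append(t + nand2_delay(1, transition))
            # Simultaneous switching: only if the arrival windows overlap.
            if can_switch_together(ea, la, eb, lb):
                candidates.append(max(ea, eb) + nand2_delay(2, transition))
                candidates.append(max(la, lb) + nand2_delay(2, transition))
        arrival[out] = (min(candidates), max(candidates))
    return arrival

# Example (hypothetical netlist): c = NAND(a, b), d = NAND(c, b).
arrival = {"a": (0.0, 0.1), "b": (0.0, 0.3)}
gates = [("c", ("a", "b")), ("d", ("c", "b"))]
print(propagate(gates, arrival))
```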
The results of this process are demonstrated in a prototype timing analyzer, XTV.