This paper develops a theoretical basis for minimizing the chip area required in fixed-taper buffer design. It modifies the well-known procedure for minimizing delay time in such circuits to derive a minimum number of required stages. Rather than minimizing delay time, the procedure realizes a specified buffer delay time using a stage-area scale factor that minimizes the total area of the buffer. Since an integer number of tapered stages must be used while the calculations yield noninteger results, the effects of roundoff error are included.
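As a concrete illustration of this kind of procedure, the sketch below works through the standard first-order taper model: a buffer spanning a capacitance ratio R = C_load/C_in with per-stage taper f has N = ln(R)/ln(f) stages, delay T = N·f·τ (τ being a unit-inverter delay), and total area proportional to 1 + f + … + f^(N−1) = (R − 1)/(f − 1). This is an assumption-laden sketch, not the paper's exact formulation; the model, the numbers, and the function names are illustrative only.

```python
import math

# Assumed first-order model (an illustration, not the paper's exact formulation):
#   R   = C_load / C_in, the overall capacitance ratio the buffer must span
#   f   = per-stage taper (area scale) factor, so N = ln(R) / ln(f) stages
#   T   = N * f * tau, with tau the delay of a unit inverter driving its twin
#   A   = 1 + f + ... + f**(N-1) = (R - 1) / (f - 1)   (in unit-inverter areas)

def taper_for_delay(R, T_spec, tau):
    """Solve f / ln(f) = T_spec / (tau * ln(R)) for the taper factor.

    For f > e the area (R - 1)/(f - 1) falls as f grows, so the larger
    of the two roots is the area-minimizing choice."""
    K = T_spec / (tau * math.log(R))
    if K < math.e:
        raise ValueError("delay spec is below the minimum-delay bound (f = e)")
    lo, hi = math.e, R          # f/ln(f) is increasing on (e, inf)
    for _ in range(100):        # plain bisection
        f = 0.5 * (lo + hi)
        if f / math.log(f) < K:
            lo = f
        else:
            hi = f
    return f

def integer_stage_designs(R, T_spec, tau):
    """Round the generally noninteger stage count both ways and report
    the delay and area that each integer choice actually achieves."""
    f = taper_for_delay(R, T_spec, tau)
    N_exact = math.log(R) / math.log(f)
    designs = []
    for N in sorted({math.floor(N_exact), math.ceil(N_exact)}):
        if N < 1:
            continue
        f_N = R ** (1.0 / N)    # taper re-derived for the integer stage count
        designs.append((N, f_N, N * f_N * tau, (R - 1) / (f_N - 1)))
    return designs

for N, f, delay, area in integer_stage_designs(R=1000.0, T_spec=25.0, tau=1.0):
    print(f"N={N}  f={f:.3f}  delay={delay:.2f}  area={area:.1f}")
```

Under these assumed numbers (R = 1000, delay budget 25τ) the noninteger optimum is N ≈ 3.53: rounding down to N = 3 roughly halves the area but overshoots the delay budget (30τ), while rounding up to N = 4 meets it (≈22.5τ) at greater area, illustrating the roundoff effect the abstract refers to.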