Since the greatest common divisor (GCD) of two integers is a basic arithmetic operation used in many mathematical software systems, new algorithms for its computation are of widespread interest. The accelerated integer GCD algorithm discussed here is based on a reduction step proposed by Sorenson (the k-ary reduction), coupled with the dmod operation, which is similar to Norton's smod. Some practical limitations of Sorenson's reduction have been eliminated. The worst-case complexity is still O(n^2) for n-bit input, but actual implementations run more than 5.5 times as fast as the binary GCD on inputs about 4096 bits long, on one computer architecture that has a multiply instruction. Independent research by Jebelean points to the same conclusions.
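As a rough illustration of the reduction the algorithm builds on, below is a minimal Python sketch of Sorenson's basic k-ary step, not Weber's dmod-accelerated variant. The function name kary_reduce, the choice k = 2^32, and the demo values are illustrative assumptions, not from the paper; it needs Python 3.8+ for pow(v, -1, k) and math.isqrt.

    from math import gcd, isqrt
    import random

    def kary_reduce(u, v, e=32):
        """One basic k-ary reduction step (Sorenson), with k = 2**e.
        Assumes u >= v > 0, both odd. Returns w = |a*u - b*v| / k for
        small a, b, so w is about e/2 bits shorter than u. gcd(u, v)
        still divides gcd(w, v), but spurious factors of a or b can
        creep in and must be removed at the end of a full GCD run."""
        k = 1 << e
        x = (u * pow(v, -1, k)) % k   # v is odd, hence invertible mod 2**e
        # Truncated extended Euclid on (k, x) yields a, b with
        # a*x = b (mod k) and |a|, |b| <= sqrt(k).
        r0, r1 = k, x
        t0, t1 = 0, 1
        while r1 >= isqrt(k):
            q = r0 // r1
            r0, r1 = r1, r0 - q * r1
            t0, t1 = t1, t0 - q * t1
        a, b = t1, r1
        # Since b = a*x and x*v = u (mod k), a*u - b*v = 0 (mod k),
        # so the shift below is exact.
        return abs(a * u - b * v) >> e

    # Demo on operands about 4096 bits long, the size used in the
    # paper's timings.
    u = random.getrandbits(4096) | 1
    v = random.getrandbits(4096) | 1
    if u < v:
        u, v = v, u
    w = kary_reduce(u, v)
    assert gcd(w, v) % gcd(u, v) == 0  # the true GCD survives the step
    print(u.bit_length(), "->", w.bit_length())

The paper's dmod operation (similar to Norton's smod) and its removal of the practical limitations of Sorenson's reduction are what distinguish the accelerated algorithm from this generic step, which finds its small multiplier by an extended-Euclid search.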