Implement, on your favorite computer system, one of the
following three algorithms: (a) the Karatsuba-Ofman algorithm
for integer multiplication, (b) Strassen's matrix multiplication
algorithm, (c) the deterministic linear-time median algorithm.
Compare the running time of your
program to that of some standard method for the same problem.
How large must the inputs be before the asymptotically
faster algorithm supersedes the standard method?
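
For part (a), a minimal Karatsuba-Ofman sketch in Python is given
below; the function name, the recursion cutoff, and the splitting
point are illustrative choices, not prescribed by the exercise.

def karatsuba(x, y):
    """Karatsuba-Ofman product of non-negative integers (part (a) sketch).

    Below the (arbitrary) cutoff we fall back on the machine's ordinary
    product; above it, one multiplication of n-digit numbers is replaced
    by three multiplications of roughly n/2-digit numbers.
    """
    if x < (1 << 64) or y < (1 << 64):       # cutoff chosen for illustration only
        return x * y
    n = max(x.bit_length(), y.bit_length()) // 2
    x_hi, x_lo = x >> n, x & ((1 << n) - 1)  # x = x_hi * 2**n + x_lo
    y_hi, y_lo = y >> n, y & ((1 << n) - 1)  # y = y_hi * 2**n + y_lo
    a = karatsuba(x_hi, y_hi)
    b = karatsuba(x_lo, y_lo)
    c = karatsuba(x_hi + x_lo, y_hi + y_lo)
    # x*y = a*2**(2n) + (c - a - b)*2**n + b, using three recursive products
    return (a << (2 * n)) + ((c - a - b) << n) + b

Note that CPython's built-in integer product already switches to a
Karatsuba routine for large operands, so a hand-written quadratic
(grade-school) multiplier may be a fairer ``standard method'' to
time against than the * operator.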
Hint for case (a): one ``standard'' method for
long-integer multiplication is the following algorithm:
multiply(x,y);
{ z := 0;
  while y > 0 do
  { z := z + x;
    y := y - 1;
  };
  return(z)
}
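
Read literally, the hinted routine multiplies by repeated addition,
so its cost grows with the value of y rather than with the number of
digits; any of the algorithms in (a) overtakes it almost at once.
A direct Python rendering plus a rough timing check against the
built-in product is sketched below; the function name and the 24-bit
test size are illustrative.

import random
import time

def slow_multiply(x, y):
    """The hinted "standard" method, read literally: repeated addition."""
    z = 0
    while y > 0:        # y iterations -- exponential in the digit count of y
        z += x
        y -= 1
    return z

if __name__ == "__main__":
    bits = 24                            # illustrative size; larger gets painful
    x = random.getrandbits(bits) | 1
    y = random.getrandbits(bits) | 1
    t0 = time.perf_counter(); slow = slow_multiply(x, y); t1 = time.perf_counter()
    t2 = time.perf_counter(); fast = x * y;                t3 = time.perf_counter()
    assert slow == fast
    print(f"{bits}-bit operands: repeated addition {t1 - t0:.3f}s, "
          f"built-in {t3 - t2:.6f}s")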