Talk:Toom–Cook multiplication

Untitled section
I spotted some mistakes in http://www.wikipedia.org/wiki/Toom3

c = ab = C1·10^(4w) + C2·10^(3w) + C3·10^(2w) + C4·10^w + C5, which can be represented as c = (C1, C2, C3, C4, C5)

while:

C1 = A1·B1
C2 = A1·B1 + A2·B1 <===*** should be A1·B2 + A2·B1
C3 = A2·B2         <===*** should be A1·B3 + A2·B2 + A3·B1
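The corrected coefficients can be sanity-checked numerically; a minimal sketch in Python, assuming a three-way split in base 10^w (the sample digit values are mine):

```python
# Check the convolution coefficients for a three-way split:
# a = A1*10^(2w) + A2*10^w + A3, and likewise for b.
w = 4
base = 10 ** w

A1, A2, A3 = 1234, 5678, 9012
B1, B2, B3 = 4321, 8765, 2109

a = A1 * base**2 + A2 * base + A3
b = B1 * base**2 + B2 * base + B3

C1 = A1 * B1
C2 = A1 * B2 + A2 * B1            # not A1*B1 + A2*B1
C3 = A1 * B3 + A2 * B2 + A3 * B1  # not just A2*B2
C4 = A2 * B3 + A3 * B2
C5 = A3 * B3

c = C1 * base**4 + C2 * base**3 + C3 * base**2 + C4 * base + C5
assert c == a * b
```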

Where to report the mistake? here? Sheau Ching.

Yes, this is where to put mistakes. Vancouverguy 03:09, 2 Oct 2003 (UTC)


 * Or you could just make the change yourself. Check out How to edit a page and be bold! Angela 03:13, Oct 2, 2003 (UTC)

Proposal to rename / generalize article
I'd like to propose renaming this article to "Toom-Cook multiplication" and generalizing it a bit. I've never renamed an article, and I don't know what problems that might cause for external sites, so I'm not going to do it myself.

My main objection is that the Toom-Cook algorithm doesn't just specify a three-way split of the numbers; it's more general than that. The article makes it sound like three-way is the only way.

In addition, I believe that it should be noted that any computer implementation of this algorithm (and who's doing Toom-Cook by hand?) will use a power-of-two base, instead of the base 10 used in the article.--Eliasen 23:32, 3 Oct 2004 (UTC)

This article needs your help
This article desperately needs to be spruced up. I removed the broken example and added some (ok, a lot of) commentary.

Still need someone to write a working example for Toom-3.

The example
I'm trying to understand the example.

/85.228.39.223 15:58, 7 September 2007 (UTC)
 * "To compute c(x) = a(x) * b(x), we evaluate the two polynomials a and b in five points, and multiply the results, for our example we perform the following:"
 * How were the points in this example chosen, and why?
 * What does c(1/0) mean? It can't possibly mean c(x) where x is 1 divided by 0?
 * "Then solve the interpolation problem:"
 * What interpolation problem?
 * How is this solved by performing a number of operations?
 * What, exactly, are these operations anyway? Well, what they are isn't that hard to figure out, but why replace w3 with a value calculated from w3 and w1?  Why not create a new set of variables instead, to avoid confusion over which w3 is used?
 * "The result can be recomposed with:"
 * Right... Again, how was the ordering etc. chosen here? Obviously it depends on the choices made in the first and second steps, but how?
 * These are all great questions - the existing example was sparse and omitted a lot of detail about how the algorithm works. I hope the new explanation makes more sense - please take a look and let me know if there's anything you still don't understand. Dcoetzee 01:29, 14 October 2007 (UTC)
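For reference, here is a minimal working sketch of Toom-3 (evaluation at the points 0, 1, -1, -2 and "infinity", with Bodrato's interpolation sequence), in Python and in base 10^w to match the article; the function name and the choice of w are mine:

```python
# Toom-3 sketch: split each operand into three limbs, evaluate at
# five points, multiply pointwise, and interpolate back.
def toom3(a, b, w=4):
    base = 10 ** w
    a0, a1, a2 = a % base, (a // base) % base, a // base**2
    b0, b1, b2 = b % base, (b // base) % base, b // base**2

    # Evaluation: p(x) = a2*x^2 + a1*x + a0 at x = 0, 1, -1, -2, inf.
    p0, p1, pm1, pm2, pinf = a0, a2 + a1 + a0, a2 - a1 + a0, 4*a2 - 2*a1 + a0, a2
    q0, q1, qm1, qm2, qinf = b0, b2 + b1 + b0, b2 - b1 + b0, 4*b2 - 2*b1 + b0, b2

    # Five pointwise multiplications -- the only "big" multiplications.
    r0, r1, rm1, rm2, rinf = p0*q0, p1*q1, pm1*qm1, pm2*qm2, pinf*qinf

    # Interpolation (Bodrato's sequence): recover c0..c4 of c(x) = p(x)q(x).
    # All divisions below are exact.
    c0 = r0
    c4 = rinf
    t3 = (rm2 - r1) // 3
    t1 = (r1 - rm1) // 2
    t2 = rm1 - r0
    t3 = (t2 - t3) // 2 + 2 * rinf   # = c3
    t2 = t2 + t1 - c4                # = c2
    t1 = t1 - t3                     # = c1

    return c0 + t1*base + t2*base**2 + t3*base**3 + c4*base**4
```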

Evaluation and Interpolation Vandermonde Matrices
Why were the Vandermonde-like evaluation and interpolation matrices chosen to have ascending degree along the rows in the worked example? This is the opposite of the convention used by the Bodrato paper that is often referred to in the very same example. Was it more didactic this way? It makes it a bit confusing when reading the two. — Preceding unsigned comment added by 178.208.131.82 (talk) 20:59, 20 December 2011 (UTC)
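For comparison, with interpolation points $0, 1, -1, -2, \infty$ and ascending degree along the rows (the article's convention, as I read it), the interpolation-side matrix applied to $(c_0, c_1, c_2, c_3, c_4)^T$ would be:

$$\begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 1 & 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 & 1 \\ 1 & -2 & 4 & -8 & 16 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix}$$

Bodrato's descending-degree convention is, as far as I can tell, the same matrix with its columns reversed.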

Original research by 93.118.212.93 (Florin Matei), collapsed

 * Ideas that might help with the coefficient that slows the O bound -- possible improvements for solving the system of equations:

The main creative idea is to use O(k) equations that can be written directly in the coefficients we are looking for. In practice, use a specially weighted sum of products from term split a to term split b, where a and b are each at least one of 0 or k-1 (for the k*k problem), and let the term coefficients be 2^i or 2^(-i). We can observe that there is a possibility of forming an entire system of equations that works directly in the desired coefficients. Maybe this might be helpful for solving the system of equations faster, but I don't know how, at least not yet, sorry.

93.118.212.93 (talk) 15:26, 2 February 2013 (UTC)

OK, I also think that basically applying P(1) and P(-1) recurrently might be helpful to quickly solve for k = 2^h partitions... first we get the sums for the odd/even-rank coefficients... then apply it to the odds and the evens, and so on, down to the bottom coefficients, which are the desired ones: the evens, for example, are obtained from splitting even with even and odds with odds: solving for the k = 2^h coefficients in some O(k) or O(k*log(k)) operations with k groups of linear additions or subtractions. Hoping this one works. 93.118.212.93 (talk) 19:33, 2 February 2013 (UTC)


 * Ideas that claim to be faster than Toom-Cook -- let's see:

Assuming we have to multiply (A1;A2;A3)*(B1;B2;B3), we can plan this Karatsuba-style like this: (A1;A2)*(B1;B2), (A3)*(B3) and (A1;(A2+A3))*(B1;(B2+B3))... it would be a mistake to first do the additions (A2+A3) and (B2+B3), since instead of computing "one mul" we would compute "3 muls"; but taking advantage of previously computed values, there is only one value left to multiply, (A1+A2+A3)*(B1+B2+B3), making a total of 5 muls just like Toom-Cook, but faster... and I think it might be generalized to an n*n plan with 2*n-1 muls... Florin Matei, Romania 93.118.212.93 (talk) 11:03, 4 March 2013 (UTC)

The possible key to approaching this could be forcing B2 = B3; one thing after another, I am starting to be not so pleased with my own work here, sorry :) 93.118.212.93 (talk) 11:53, 4 March 2013 (UTC)


 * OK, I think I'll stay with the idea of using 2^i as a factor:

Well, we can see the polynomial form in many ways; we can see sub-polynomial forms, and I think, if we want to get rid of the coefficient that slows the algorithm, it is better to stay with the interpolation point x0 = 2 but use more than one sub-polynomial form, finding the whole group of wanted coefficients in practically linear time... ok ;)) 93.118.212.93 (talk) 16:17, 7 March 2013 (UTC)

OK, the Romanian cowboy is just suggesting that we could use more than 2*n-1 muls, even 4*n muls, while keeping a decent sampling point like x0 = 1 or x0 = 2 in order to make the solving of the system of equations linear... in the case x0 = 1 we might not even need recursion... and all right, it is about the sub-polynomial form either way... the rest is Good Luck To You, Yankees!! LOL 93.118.212.93 (talk) 12:37, 9 March 2013 (UTC)

Considering we have two long integers to multiply, each at least 100 bytes long, we first divide the two integers into M parts each, then form products of the first K terms summed (multiplied with the corresponding sum of the first k coefficients from the second number). In the computed product appears the sum of the first k desired coefficients of the result, plus another term that can be kept in some recurrent loop, in order to successfully compute the desired sum of the first k coefficients of the result... it works either for the first k or the last k, so we could compute all coefficients of the multiplication result; it is O(n) multiplied just by a constant that is < 10, let's say.

Hi, I think that for the general k*k Toom-Cook approach we might use the interpolation point x0 = 2 for the following idea: let's say we typically interpolate some truncated first polynomial with its corresponding truncated polynomial. Practically, from those k and k coefficients we typically keep the first m and m, with m <= k. I think we can say that after taking the product of those polynomials at the interpolation point x0 = 2, the terms obtained can be split into the group of the first m most significant powers and the rest: we are interested in this first group to create the triangular system of equations... the remaining m terms (m as a complexity order) need to be excluded, and we might use a queue of memorized results for that, maintainable in some recurrent semantic loop to fulfill the algorithm's need to select the first m terms... we might also need to consider this discussion for the last m truncated coefficients out of those k, etc., in order to obtain the 2*k-1 equations needed for the triangular system of linear equations... nearly O(N) complexity in time is expected, with an additional O(N) of memory used :) Florin Matei, Romania 93.118.212.93 (talk) 03:20, 9 May 2013 (UTC)
 * Toom-Cook idea and some triangular system of linear equations, easy to solve:

Toom-Cook using derivatives
If you use the derivative p'(0) instead of p(-2), you get a simpler matrix inversion, at the price of 6 multiplications instead of 5. This is because the derivative of a product needs the computation of two products: $$r'(0) = p'(0) \cdot q(0) + p(0) \cdot q'(0)$$. Nonetheless this seems to be a reasonable approach to me. Does someone know whether and where this has been explored? HenningThielemann (talk) 10:33, 11 August 2013 (UTC)
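The identity in question is easy to sanity-check numerically; a minimal sketch (the helper name and sample coefficients are mine):

```python
# Check the product rule at 0: for r = p*q, r'(0) = p'(0)*q(0) + p(0)*q'(0).
# Coefficients are stored low-to-high: p(x) = p[0] + p[1]*x + p[2]*x^2.
def polymul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

p = [3, 1, 4]   # p(x) = 3 + x + 4x^2
q = [1, 5, 9]   # q(x) = 1 + 5x + 9x^2
r = polymul(p, q)

# r'(0) is the linear coefficient r[1]; the product rule recovers it
# from two small multiplications instead of one evaluation of r.
assert r[1] == p[1] * q[0] + p[0] * q[1]
```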
 * I haven't worked out what you're up to, but I don't see the point. The whole point of Toom-Cook is to cut down the number of multiplications; the divisions by constants are much faster. Anyway, the Karatsuba algorithm already reduces four multiplications to three. Dmcq (talk) 12:59, 11 August 2013 (UTC)

What would "Toom-1" actually be?
IMO, Toom-1.5 is some kind of long multiplication, except that it doesn't immediately subdivide matters into MxN elementary multiplications. Instead, it cuts the problem in half. Still, close enough.

Toom-2 is basically a systematic way to derive Karatsuba's algorithm, instead of matching coefficients by brute force, which would get impractical beyond 3 even using computers. No problem there, either.
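To illustrate, a minimal sketch of Toom-2 in the evaluate/interpolate frame (points 0, 1, infinity): the three multiplications and the trivial interpolation are exactly Karatsuba. The function name and base are my choices:

```python
# Toom-2: split each operand in two, evaluate p(x) = a1*x + a0 at
# 0, 1 and infinity, multiply pointwise, interpolate -- i.e. Karatsuba.
def toom2(a, b, w=4):
    base = 10 ** w
    a0, a1 = a % base, a // base
    b0, b1 = b % base, b // base

    r0   = a0 * b0                # p(0) * q(0)
    r1   = (a0 + a1) * (b0 + b1)  # p(1) * q(1)
    rinf = a1 * b1                # leading coefficients

    c1 = r1 - r0 - rinf           # the only interpolation step
    return r0 + c1 * base + rinf * base**2
```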

Toom-3 is of course the logical extension of Toom-2, useful for bigger numbers of approximately the same length, and Toom-2.5 is a version optimized for numbers where one is significantly longer than the other but less than twice as long.

But "Toom-1" looks like a purely theoretical extension of Toom's idea to me: it doesn't cut the problem down, but merely replaces it with itself. Unlike the others, it doesn't form a useful algorithm, but would result in infinite recursion if implemented. Yet, section 2.1 claims that it "degenerates to long multiplication" — which a viable algorithm surely would, but only because such an algorithm wouldn't resort to actual Toom-1 for any input.

Which means that "Toom-1" is like 1 in comparison to primes; unlike the rest, it doesn't reduce a problem any further: running Toom-1 is a non-operation just like dividing an integer by 1. In both cases, any input is a fixed point.

84.148.246.92 (talk) 21:53, 7 October 2020 (UTC)


 * I concur. The claims about “Toom-1” in the article do not really make sense.—Emil J. 08:08, 14 June 2024 (UTC)