How long does it take to multiply very large integers? Using the algorithm you learned in elementary school, it takes O(*n*²) operations to multiply two *n*-digit numbers. But for large enough numbers it pays to carry out multiplication very differently, using FFTs.
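The elementary-school algorithm can be sketched in a few lines. This is a minimal illustration, not production code: every digit of one number multiplies every digit of the other, which is where the O(*n*²) count comes from.

```python
def schoolbook_multiply(a, b):
    """Multiply two non-negative integers digit by digit, the way
    it's taught in elementary school: n*m digit products for inputs
    of n and m digits, i.e. O(n^2) for two n-digit numbers."""
    xs = [int(d) for d in str(a)][::-1]  # least significant digit first
    ys = [int(d) for d in str(b)][::-1]
    prod = [0] * (len(xs) + len(ys))
    for i, x in enumerate(xs):       # every digit of a ...
        for j, y in enumerate(ys):   # ... times every digit of b
            prod[i + j] += x * y
    for k in range(len(prod) - 1):   # propagate carries
        prod[k + 1] += prod[k] // 10
        prod[k] %= 10
    return int("".join(map(str, prod[::-1])).lstrip("0") or "0")
```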

If you’re multiplying integers with tens of thousands of decimal digits, the most efficient algorithm is the Schönhage-Strassen algorithm, which takes

O(*n* log *n* log log *n*)

operations. For smaller numbers, another algorithm, such as Karatsuba’s algorithm, might be faster. And for impractically large numbers, Fürer’s algorithm is faster.

What is impractically large? Let’s say a number is impractically large if storing it requires more than a billion dollars’ worth of RAM. If I did my back-of-the-envelope math correctly, you can buy enough RAM to store about 2^{57} bits for about a billion dollars. The Schönhage-Strassen algorithm is more efficient than Fürer’s algorithm for numbers with fewer than 2^{64} bits.
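The back-of-the-envelope arithmetic goes roughly like this. The price per gigabyte below is my own assumption for server-class RAM, not a figure from the post; with it, a billion dollars buys on the order of 2^{57} bits.

```python
import math

# Assumed price of server-class RAM -- a rough, hypothetical figure.
DOLLARS_PER_GB = 50.0
BUDGET_DOLLARS = 1e9
BITS_PER_GB = 8 * 2**30

gigabytes = BUDGET_DOLLARS / DOLLARS_PER_GB  # ~2e7 GB
bits = gigabytes * BITS_PER_GB
print(f"log2(bits) ~ {math.log2(bits):.1f}")  # on the order of 57
```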

**Update** (March 18, 2019): David Harvey and Joris van der Hoeven have proven that integer multiplication can be done in

O(*n* log *n*)

operations. They give two algorithms: a simpler algorithm that depends on an unproven conjecture in number theory, and a more complicated algorithm that does not require unproven assumptions. As with the algorithms mentioned above, the new algorithms are only an improvement for impractically large numbers, so the result is primarily of theoretical interest.

**Related post**: How fast can you multiply matrices?