The latest blog post from Gödel’s Lost Letter and P=NP looks at the problem of finding relatively prime pairs of large numbers. In particular, they want a deterministic algorithm. They mention in passing that the probability of a set of *k* large integers being relatively prime (coprime) is 1/ζ(*k*) where ζ is the Riemann zeta function. This blog post will look closer at that statement.

## How large is large?

What does it mean for the numbers to be large? Large enough that the asymptotic result given by the zeta function is sufficiently accurate. We’ll explore this with simulation code shortly.

The exact statement is that in the limit as *N* goes to infinity, the probability of *k* integers chosen uniformly from 1 to *N* being coprime converges to 1/ζ(*k*). I won’t go into the proof here, but in a nutshell, it falls out of Euler’s product formula for the zeta function.
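Here is the standard heuristic that the proof makes rigorous. A fixed prime *p* divides an integer chosen uniformly from 1 to *N* with probability roughly 1/*p*, so it divides all *k* of them with probability roughly *p*<sup>−*k*</sup>. Treating the events for different primes as independent gives

```latex
P(\gcd(n_1, \dots, n_k) = 1)
  = \prod_p \left(1 - \frac{1}{p^k}\right)
  = \left(\prod_p \frac{1}{1 - p^{-k}}\right)^{-1}
  = \frac{1}{\zeta(k)}
```

where the last equality is Euler’s product formula for the zeta function. This is only a sketch; justifying the independence step in the limit is the substance of the actual proof.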

## What does it mean for a set of numbers to be coprime?

The probability above seemed wrong to me at first. The function ζ(*k*) decreases as *k* increases, and so its reciprocal increases. That means it’s more likely that a set of numbers is coprime as the size of the set increases. But the larger the set of numbers, the more likely it is that two will have a common factor, so shouldn’t the probability that they’re coprime go down with *k*, not up?

The resolution to the apparent contradiction is that I had the wrong idea of coprime in mind. Two integers are coprime if they have no prime factors in common. How do you generalize that to more than two integers? You could define a set of numbers to be coprime if every pair of numbers in the set is coprime. That’s the definition I had in mind. But that’s not the standard definition. The right definition here is that the greatest common divisor of the set is 1. For example, (6, 14, 21) would be coprime because no single prime divides all three numbers, even though no pair of numbers from the set is coprime.
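The two definitions are easy to compare in code. Here’s a small sketch using `math.gcd` (where `gcd` lives in Python 3), checking the (6, 14, 21) example both ways:

```python
from math import gcd
from functools import reduce
from itertools import combinations

def setwise_coprime(nums):
    # Standard definition: the gcd of the whole set is 1.
    return reduce(gcd, nums) == 1

def pairwise_coprime(nums):
    # Stricter definition: every pair shares no common factor.
    return all(gcd(a, b) == 1 for a, b in combinations(nums, 2))

nums = (6, 14, 21)
print(setwise_coprime(nums))   # True: no single prime divides all three
print(pairwise_coprime(nums))  # False: e.g. gcd(6, 14) = 2
```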

## Python simulation

Let’s write some Python code to try this out. We’ll randomly generate some sets of large numbers and see how often they’re coprime.

How do you generate large random integers in Python? You can use the function `getrandbits` to generate numbers as large as you like. In the code below we’ll generate 128-bit integers.

How do you compute the greatest common divisor in Python? There’s a library function `gcd`, but it only takes two integers. We’ll use the `reduce` function to fold `gcd` over a list of integers.
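As a quick sanity check of the sizes involved, `getrandbits(128)` returns a non-negative integer below 2<sup>128</sup>. (The top bit isn’t guaranteed to be set, so the bit length is *at most* 128.)

```python
from random import getrandbits

n = getrandbits(128)
print(n)               # a random non-negative integer below 2**128
print(n.bit_length())  # at most 128
```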

How do you compute the zeta function in Python? SciPy doesn’t have an implementation of ζ(*x*) per se, but it does have an implementation of ζ(*x*)-1. Why not just implement ζ itself? Because for large *x*, ζ(*x*) is close to 1, so more precision can be saved by computing ζ – 1.
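To see the precision point concretely, here’s a pure-Python sketch of ζ(*x*) − 1 via a truncated series. (SciPy’s `zetac` uses a more careful algorithm; this is just for illustration.) For large *x* the value is dominated by the 2<sup>−*x*</sup> term, so ζ(*x*) is so close to 1 that storing ζ − 1 keeps digits that would be lost to rounding.

```python
def zeta_minus_1(x, terms=1000):
    # sum over n >= 2 of n^(-x); the n = 1 term is the 1 we leave off
    return sum(n**-x for n in range(2, terms))

print(zeta_minus_1(20))  # tiny: zeta(20) is extremely close to 1
```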

Putting these pieces together, here’s code to randomly generate triples of 128-bit integers, see how often they’re coprime, and compare the result to 1/ζ(3).

```python
from random import getrandbits
from fractions import gcd
from functools import reduce
from scipy.special import zetac

def riemann_zeta(x):
    return zetac(x) + 1

k = 3
size = 128
N = 10000

coprime_count = 0
for _ in range(N):
    x = [getrandbits(size) for _ in range(k)]
    if reduce(gcd, x) == 1:
        coprime_count += 1

print(coprime_count/N)
print(1/riemann_zeta(k))
```

When I ran this I got

```
0.8363
0.831907372581
```

Simulation agrees with theory to two decimal places, which is just what we should expect from 10,000 repetitions. (We’d expect error on the order of 1/√*N*.)
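The 1/√*N* remark can be made a little more precise. The simulation estimates a proportion from *N* Bernoulli trials, so its standard error is √(*p*(1 − *p*)/*N*), a quick back-of-the-envelope check:

```python
from math import sqrt

N = 10_000
p = 0.831907  # 1/zeta(3), truncated

# standard error of a proportion estimated from N Bernoulli trials
se = sqrt(p * (1 - p) / N)
print(se)               # roughly 0.004
print(abs(0.8363 - p))  # observed error, same order of magnitude
```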

Here’s what I got when I changed `k` to 4:

```
0.9234
0.923938402922
```

And here’s what I got for `k` set to 5:

```
0.9632
0.964387340429
```

## Comments

Thanks for the post! A long time ago, I misremembered something about coprime integers and π²/6; it’s nice to see the correct formulation.

Some notes regarding Python 3.6:

- I got a `DeprecationWarning` about `fractions.gcd()` in favor of `math.gcd()`.
- It looks like SciPy also has a generalized `scipy.special.zeta(x, q)`, which simplifies to your `riemann_zeta` when `q` is 1 or None.
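For reference, the deprecation is a one-line change: the same fold works with `math.gcd`, which is where `gcd` lives in current Python 3.

```python
from math import gcd
from functools import reduce

def gcd_of_list(nums):
    # fold gcd over the list, exactly as in the post
    return reduce(gcd, nums)

print(gcd_of_list([6, 14, 21]))   # 1: the set is coprime
print(gcd_of_list([12, 18, 24]))  # 6
```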

Thanks again!