Algorithm used for world record pi calculations

The following algorithm is based on work of Ramanujan and has been used in several world-record calculations of pi.

Initialize a_0 = 6 − 4√2 and y_0 = √2 − 1. Then compute

y_{n+1} = \frac{1 - (1-y_n^4)^{1/4}}{1 + (1-y_n^4)^{1/4}}

and

a_{n+1} = (1 + y_{n+1})^4 a_n - 2^{2n+3} y_{n+1}(1 + y_{n+1} + y_{n+1}^2)

The terms a_n form a sequence of approximations to 1/π. The error in each approximation is given by

0 < a_n - \frac{1}{\pi} < 16 \cdot 4^n \exp(-2 \cdot 4^n \pi)

This says that the number of correct digits roughly quadruples after each step. To see this, note that the number of correct decimal places after the nth step is the negative of the logarithm base 10 of the error:

\frac{2 \cdot 4^n \pi - n \ln 4 - \ln 16}{\ln 10}

[The error term goes to zero so quickly that you cannot (in ordinary precision) compute the error bound and then take logarithms; the exp(−2π·4^n) term will underflow to zero for n larger than 3. (See why here.) You have to take the log of the error term directly before evaluating numerically.]
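A quick numerical illustration with plain Python floats: at n = 4 the bound itself underflows, but its logarithm is easy to evaluate directly.

```python
import math

n = 4

# Evaluating the bound directly: exp(-2*pi*4^4) = exp(-1608.5...) underflows
# to 0.0, since the smallest positive double is around 1e-308.
bound = 16 * 4**n * math.exp(-2 * math.pi * 4**n)
print(bound)  # 0.0

# Taking logs first avoids the underflow entirely.
log10_bound = (n * math.log(4) + math.log(16) - 2 * math.pi * 4**n) / math.log(10)
print(log10_bound)  # about -695, i.e. roughly 695 correct digits at step 4
```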

The number of correct digits quadruples at each step because the leading term in the numerator above grows like 4^n.

To give a few specifics, a_1 is accurate to 9 digits, a_2 is accurate to 41 digits, and a_3 is accurate to 171 digits. After 10 steps, a_10 is accurate to over 2.8 million digits.
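Here is a sketch of the iteration in Python, using the standard decimal module for extended precision. (The precision setting and the number of steps below are illustrative choices, not part of the algorithm itself.)

```python
from decimal import Decimal, getcontext

getcontext().prec = 100  # work with 100 significant digits

a = 6 - 4 * Decimal(2).sqrt()   # a_0
y = Decimal(2).sqrt() - 1       # y_0

for n in range(3):
    # y_{n+1} and a_{n+1} from the recurrences above
    r = (1 - y**4) ** Decimal("0.25")
    y = (1 - r) / (1 + r)
    a = a * (1 + y)**4 - 2**(2*n + 3) * y * (1 + y + y**2)

# a is now a_3, which approximates 1/pi to ~171 digits
# (limited here by the 100-digit working precision).
print(1 / a)
```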

The algorithm given here was state of the art as of 1989. It was used to set world records in 1986. I don’t know whether it has been used more recently. See more here.

According to this page, π has been calculated to 6.4 billion digits. You can beat this record if you can carry out 16 steps of the method described here. a_16 would be accurate to over 11 billion digits.

Update (18 April 2012): The algorithm used most recently for world record calculations for pi has been the Chudnovsky algorithm. As of October 2011, the record was over 10 trillion digits.


A Ramanujan series for calculating pi

Ramanujan discovered the following remarkable formula for computing π:

\frac{1}{\pi} = \sum_{n=0}^\infty {2n \choose n}^3 \frac{42n + 5}{2^{12n+4}}

This is not the most efficient series for computing π. My next post will give a more efficient method, also based on work of Ramanujan. But the series above is interesting for reasons explained below.

Notice that the denominator of each term in the sum above is a power of 2. This means the partial sums of the series can be represented exactly in binary. We’ll see that each term in the series adds six bits of precision to the sum once we’ve added enough terms.

To understand how to use this series, we need to estimate the binomial coefficient term. Stirling’s approximation shows that

{2n \choose n} \sim \frac{2^{2n}}{\sqrt{\pi n}}

This tells us that the nth term in the series is a rational number with numerator something like 2^(6n) and denominator 2^(12n+4). Therefore the nth term is on the order of 2^(-6n-4), and so the series converges quickly. The first three terms illustrate this:

\frac{1}{\pi} = \frac{5}{16} + \frac{376}{65536} + \frac{19224}{268435456} + \cdots

The error in summing up a finite number of terms is approximately the first term in the remainder, so just a few terms leads to an accurate approximation for 1/π and hence an accurate approximation for π.

To be a little more precise, when we sum the series from n = 0 to n = N, the error in approximating 1/π is on the order of the next term, 2^(-6N-10). Said another way, summing through the Nth term computes 1/π to about 6N + 10 bits. For example, suppose we wanted to compute 1/π to 200 decimal places. That’s about 665 bits, and so taking N = 110, about 111 terms of the series, should be enough.
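Both the exactness-in-binary claim and the accuracy estimate are easy to check in Python with exact rational arithmetic. (The choice N = 5 below is just for illustration.)

```python
from fractions import Fraction
from math import comb

def partial_sum(N):
    """Exact partial sum of the series from n = 0 to N."""
    return sum(Fraction(comb(2*n, n)**3 * (42*n + 5), 2**(12*n + 4))
               for n in range(N + 1))

s = partial_sum(5)

# The reduced denominator is a power of 2, so the sum is exact in binary.
d = s.denominator
print(d & (d - 1) == 0)  # True

# 1/s should approximate pi to about 6*5 + 10 = 40 bits, roughly 12 digits.
print(float(1 / s))
```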

We glossed over a detail above. We said that the nth term is on the order of 2^(-6n-4). That’s true for sufficiently large n. In fact, we can say that the nth term is less than 2^(-6n-4), but only for large enough n. How large? We need the denominator term (πn)^(3/2) to be larger than the numerator term 42n + 5. This happens if n is at least as large as 58.

One advantage to using this series to compute π is that you could compute later blocks of bits without having to compute the earlier blocks, provided you’re interested in the bits beyond those contributed by the 58th term in the series.


Related post: Two useful asymptotic series

Digital workflow

William Turkel has a nice four-part series of blog posts entitled A Workflow for Digital Research Using Off-the-Shelf Tools. His four points are

  1. Start with a backup and versioning strategy.
  2. Make everything digital.
  3. Research 24/7 (using RSS feeds).
  4. Make local copies of everything.

Also by William Turkel, The Programming Historian, “an open-access introduction to programming in Python, aimed at working historians (and other humanists) with little previous experience.”

Related post: Create offline, analyze online

Engineers save millions of lives in Japan

From Dave Ewing via Roberto Montagna:

The headline you won’t be reading: “Millions saved in Japan by good engineering and government building codes”. But it’s the truth.

The loss of life in Japan is tragic, but it would have been far worse without good engineering.

Update: As Tim points out in the comments below, The New York Times did publish a story headlined Japan’s Strict Building Codes Saved Lives. This is to be commended since it’s natural to only see the people who died and not the people who did not. Along these lines, I wonder how many people did not die mining coal to generate the electricity that nuclear power has provided Japan.


Marshall McLuhan reading technique

From Douglas Coupland’s book on Marshall McLuhan:

Marshall … didn’t have the patience to work through a book that didn’t interest him from the start. He even developed a technique to suit his impatience: whenever he picked up a new book, he would turn to page 69, and if that page didn’t interest him, he wouldn’t read the book.

Marshall McLuhan: You Know Nothing of My Work!

Related post: Reading as inclination leads

Sonnet primes

The previous post showed how to list all limerick primes. This post shows how to list all sonnet primes. These are primes of the form ababcdcdefefgg, the rhyme scheme of an English (Shakespearean) sonnet, where the letters a through g represent distinct digits and a is not zero.

Here’s the Mathematica code.

number[s_] := 10100000000000 s[[1]] + 1010000000000 s[[2]] +
    1010000000 s[[3]] + 101000000 s[[4]] +
    101000 s[[5]] + 10100 s[[6]] + 11 s[[7]]

test[n_] := n > 10^13 && PrimeQ[n]

candidates = Permutations[Table[i, {i, 0, 9}], {7}];

list = Select[Map[number, candidates], test];

The function Permutations[list, {n}] creates a list of all permutations of list of length n. In this case we create all permutations of the digits 0 through 9 that have length 7. These are the digits a through g.

The function number turns the permutation into a number. This function is applied to each candidate. We then select all 14-digit prime numbers from the list of candidates using test.

If we ask for Length[list] we see there are 16,942 sonnet primes.

As mentioned before, the smallest of these primes is 10102323454577.
The largest is 98987676505033.
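As a cross-check, the same search can be done in Python. This is a sketch; the Miller–Rabin base set below is a standard deterministic choice valid far beyond 14-digit numbers.

```python
from itertools import permutations

SMALL = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)

def is_prime(n):
    """Miller-Rabin test; the first 12 primes as bases are deterministic
    for n < 3.3 * 10**24, far beyond our 14-digit candidates."""
    if n < 2:
        return False
    for p in SMALL:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in SMALL:
        x = pow(a, d, n)
        if x == 1 or x == n - 1:
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

sonnet_primes = []
for p in permutations(range(10), 7):
    if p[0] == 0:  # the digit a must not be zero
        continue
    # Same digit-position weights as the Mathematica function number[].
    n = (10100000000000 * p[0] + 1010000000000 * p[1]
         + 1010000000 * p[2] + 101000000 * p[3]
         + 101000 * p[4] + 10100 * p[5] + 11 * p[6])
    if is_prime(n):
        sonnet_primes.append(n)

print(len(sonnet_primes), min(sonnet_primes), max(sonnet_primes))
```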

Related post: Limerick primes

Limerick primes

The other day, Futility Closet posted this observation:

10102323454577 is the smallest 14-digit prime number that follows the rhyme scheme of a Shakespearean sonnet (ababcdcdefefgg).

I posted this on AlgebraFact and got a lot of responses. One was from Matt Parker who replied that 11551 was the smallest prime with a limerick rhyme scheme.

So how many limerick primes are there? Well, there aren’t many candidates. A limerick prime has to have the form AABBA where A is an odd digit and B is any digit other than A. So for each of five choices of A, there are nine possible B’s. Here’s a Mathematica program to do a brute force search for limerick primes.

For[ j = 0, j < 5, j++,
    For[ k = 0, k < 10, k++,
        x = (2 j + 1)*11001 + 110 k;
        If[ PrimeQ[x], Print[x] ]]]

It turns out there are eight limerick primes:

  • 11551
  • 33113
  • 33223
  • 33773
  • 77447
  • 77557
  • 99119
  • 99559
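The same brute-force search is easy to replicate in Python. This is a sketch; simple trial division is plenty fast at this size.

```python
def is_prime(n):
    """Trial division, fine for five-digit numbers."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

limerick_primes = []
for a in (1, 3, 5, 7, 9):          # A odd, mirroring the Mathematica loop
    for b in range(10):            # B = A gives a repdigit, never prime
        n = a * 11001 + b * 110    # digits AABBA
        if is_prime(n):
            limerick_primes.append(n)

print(limerick_primes)
```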

See the next post for Mathematica code to list all sonnet primes.


Not enchanted with "Enchantment"

I’ve read a fair number of business books, but I stopped reading them when they all started to sound alike. I have limited time for reading and so I want to read books that “blow my hair back” as Will Hunting would say.

I made an exception to my abstinence from business books when Guy Kawasaki’s publisher offered me a review copy of his new book Enchantment. The book confirmed my decision to lay off the business literature. I was surprised how much of it I’d already read elsewhere before it arrived. Much of it is a compilation of ideas and stories that were popular on the web last year. Enchantment isn’t a bad book, it just isn’t very original.

This made me think of Robert Ghrist’s quip about new books:

Reading anything less than 50 years old is like drinking new wine: permissible once or twice a year and usually followed by regret and a headache.

I can’t imagine that Enchantment would stand such a test of time. Hardly anyone will be reading it a couple years from now, much less 50 years from now.


Augustine, Leibowitz, and evolution

The following paragraph is from the science fiction novel A Canticle for Leibowitz:

A fourth century bishop and philosopher. He [Saint Augustine] suggested that in the beginning God created all things in their germinal causes, including the physiology of man, and that the germinal causes inseminate, as it were, the formless matter — which then gradually evolved into the more complex shapes, and eventually Man. Has this hypothesis been considered?

A Canticle for Leibowitz is set centuries after a nuclear holocaust. The war was immediately followed by the “Simplification.” Survivors rejected all advanced technology and hunted down everyone who was even literate. At this point in the book, a sort of Renaissance is taking place. The question above is addressed to a scientist who is explaining some of the (re)discoveries taking place. The scientist’s response was

“I’m afraid it has not, but I shall look it up,” he said, in a tone that indicated he would not.

Was the reference to Augustine simply made up for the novel, or is there something in Augustine’s writings that the author is alluding to? If so, does anyone know what in particular he may be referring to? Is such a proto-Darwinian reading of Augustine fair?

Weekend miscellany

Humor

How to be a hipster
Charlie Sheen v Muammar Gaddafi

Math

Galton’s Bayesian machine from 1877
“Mnemonic” is a mnemonic (Poisson density)
Fibonacci matrix identity
The Optimization Edge (Using optimization to run a business)

Python programming

Python tricks from Peter Norvig
Interview with Travis Oliphant on NumPy
One tweet per day on scientific computing in Python

Eclectic

Hyperauthorship (papers with a ridiculous number of authors)
Questions intellectuals ask about Christianity
Upgrading from Windows 1.0 to Windows 7
How to make a tungsten light bulb filament

Thomas Jefferson and preparing for meetings

Here’s an interesting historical anecdote from Karl Fogel’s Producing Open Source Software on the value of preparing for meetings.

In his multi-volume biography of Thomas Jefferson, Jefferson and His Time, Dumas Malone tells the story of how Jefferson handled the first meeting held to decide the organization of the future University of Virginia. The University had been Jefferson’s idea in the first place, but (as is the case everywhere, not just in open source projects) many other parties had climbed on board quickly, each with their own interests and agendas.

When they gathered at that first meeting to hash things out, Jefferson made sure to show up with meticulously prepared architectural drawings, detailed budgets for construction and operation, a proposed curriculum, and the names of specific faculty he wanted to import from Europe. No one else in the room was even remotely as prepared; the group essentially had to capitulate to Jefferson’s vision, and the University was eventually founded more or less in accordance with his plans.

The facts that construction went far over budget, and that many of his ideas did not, for various reasons, work out in the end, were all things Jefferson probably knew perfectly well would happen. His purpose was strategic: to show up at the meeting with something so substantive that everyone else would have to fall into the role of simply proposing modifications to it, so that the overall shape, and therefore schedule, of the project would be roughly as he wanted.


Psychological encapsulation

A piece of software is said to be encapsulated if someone can use it without knowing its inner workings. The software is a sort of black box. It has a well-defined interface to the outside world. “You give me input like this and I’ll produce output like that. Never mind how I do it. You don’t need to know.”

I think software development focuses too much on logical encapsulation. Code is logically encapsulated if, in theory, there is no logical necessity to look inside the black box.

Maybe there would be no need to look inside the code if everything were working correctly, but there’s a bug. Or maybe the code isn’t buggy so much as incompletely documented. Maybe your inputs push the box beyond its designed limits and you don’t know that until you use it. Or maybe the code is taking too long to execute.

Maybe there’s nothing wrong with the code, but you don’t trust it. In that case, the code is logically encapsulated but not psychologically encapsulated. That lack of trust negates the psychological benefits of encapsulation. This may be the most insidious breach of encapsulation. A failure of logical encapsulation is objective and may easily be fixed.  A loss of confidence may be much harder to repair.

Developers may avoid using a component long after bugs have been fixed. Or they may use the component but be wary of it. They don’t concentrate their full mental energy on their own code because of a lack of trust in their dependencies. When a bug shows up in their code, they may waste time checking the untrusted component.

Psychological encapsulation might explain in part why people are more productive using tools they like. For example, Windows runs well for people who like Windows. Linux runs well for people who like Linux. Macs run well for people who like Macs. It’s as if the computer knows who its friends are.

Some of this is confirmation bias: we’re less likely to notice the flaws in things we like and more likely to notice the flaws in things we don’t like. But I think there’s more going on. If someone is using an operating system they distrust, they’re going to be less productive, even if the operating system performs flawlessly. Every time something goes wrong, they suspect it might be the operating system’s fault. Of course an error might be the operating system’s fault, but this is rare.

Related post: Opening black boxes