Bayesian methods at Bletchley Park

From Nick Patterson’s interview on Talking Machines:

At GCHQ in the ’70s, we thought of ourselves as completely Bayesian statisticians. All our data analysis was completely Bayesian, and that was a direct inheritance from Alan Turing. I’m not sure this has ever really been published, but Turing, almost as a sideline during his cryptanalytic work, reinvented Bayesian statistics for himself. The work against Enigma and other German ciphers was fully Bayesian. …

Bayesian statistics was an extreme minority discipline in the ’70s. In academia, I only really know of two people who were working majorly in the field, Jimmy Savage … in the States and Dennis Lindley in Britain. And they were regarded as fringe figures in the statistics community. It’s extremely different now. The reason is that Bayesian statistics works. So eventually truth will out. There are many, many problems where Bayesian methods are obviously the right thing to do. But in the ’70s we understood that already in Britain in the classified environment.


5 thoughts on “Bayesian methods at Bletchley Park”

  1. You might enjoy sections 18.3 and 18.4 of the book “Information Theory, Inference, and Learning Algorithms” by David J.C. MacKay, where the author talks a bit about the process at Bletchley and about the ‘ban’ unit, which measures weight of evidence on a log_10 scale and is named after the town of Banbury (a short numerical illustration follows the comments).

  2. Looks like RobertM’s arxiv.org link contains the original paper my reference used, which has a section titled ‘Decibanage’.
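
To make the unit in the first comment concrete, here is a minimal sketch, not something from the post itself, of how evidence measured in decibans accumulates. The helper name and the numbers are invented for illustration: a ban is the log_10 of a likelihood ratio, a deciban is one tenth of that, and independent observations contribute additively.

```python
import math

def decibans(likelihood_h1: float, likelihood_h2: float) -> float:
    """Weight of evidence for H1 over H2 in decibans:
    10 * log10 of the likelihood ratio (hypothetical helper)."""
    return 10.0 * math.log10(likelihood_h1 / likelihood_h2)

# Independent clues add in log space, so total evidence is a simple sum.
# Made-up numbers: two clues, each twice as likely under H1 as under H2.
clues = [(0.2, 0.1), (0.06, 0.03)]
total = sum(decibans(p1, p2) for p1, p2 in clues)     # ~6.0 decibans

# Starting from even prior odds, convert the accumulated decibans back
# into a posterior probability for H1.
posterior_odds = 10 ** (total / 10.0)                 # ~4:1
posterior_p_h1 = posterior_odds / (1.0 + posterior_odds)
print(f"{total:.1f} decibans -> P(H1) ~ {posterior_p_h1:.2f}")
```

Working in decibans turns repeated multiplication of likelihood ratios into addition of small numbers, which is what made this kind of bookkeeping practical by hand.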
