From a lecture by Gregory Chaitin:

We all know that the computer is a very practical thing out there in the real world … But what people don’t remember as much is that really — I’m going to exaggerate, but I’ll say it — the computer was invented in order to help to clarify … a philosophical question about the foundations of mathematics.

I thought they were invented (or, better, funded) to win a war, like most technological breakthroughs.

Yes John, I agree with you in your comment.

I expected to hear that people were looking for automated solutions to problems of ballistics, which puts me in agreement with Stazbo Horn as expecting a military origin.

Maybe they were invented to speed up calculations in financial operations. I guess money and war are the origin of a good deal of inventions we have today.

Chaitin says he’s exaggerating, but he’s not, at least not much. The motivation for Turing’s hypothetical machine was specifically Hilbert’s Entscheidungsproblem, the question of whether there is a mechanical procedure for deciding mathematical statements. And there were other related problems in logic and mathematics in the air at the time. There were also scientific, military, and business motivations for developing computers, but those came later.

It depends how far back you want to go. One could argue it was for navigation (Babbage’s engines), followed by automating simple, boring tasks (picking weave patterns on a loom, census consolidation). I might be wrong, but my idea of a Turing machine is that it was a generalization, and in a way a simplification, of already existing mechanical computers: instead of one big set of inputs determining how dozens of parts of a machine work together to generate output, it was more of a one-symbol-at-a-time thing, with “magic” in between, followed by output.

I think strictly speaking Chaitin is right. Computation, the theoretical basis needed before physical computers could be built, was invented as a by-product of answering Hilbert’s Entscheidungsproblem.

Once the theory existed, the military saw applications and funded the actual construction of such machines. But that wouldn’t have been possible without the ideas fundamentally being in place already.

@Marc

Other way around. A theoretical basis for computation needn’t exist before building useful physical computers any more than a theory of gravity need exist before building a pyramid. The theory is useful but it isn’t required. You can just start piling rocks on top of each other.

Similarly, you can make computers without knowing the theory. You can make an adder and a register and some kind of input device and you’ve got yourself a calculator. No knowledge of Turing machines or completeness required. The theory will tell you what the limits are.
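To make that point concrete, here is a toy sketch in Python (all names here are my own, purely illustrative): an “adder” plus a “register” really is enough for a working calculator, and nothing in it requires any theory of computation.

```python
# A toy "calculator" built from just a register and an adder:
# no notion of Turing machines or universality required.
class Accumulator:
    def __init__(self):
        self.register = 0  # the single storage cell

    def add(self, value):
        # the "adder": combine the input with the register's contents
        self.register += value
        return self.register

calc = Accumulator()
calc.add(2)
calc.add(3)
print(calc.register)  # 5
```

It computes, but it can only ever add; the theory is what tells you how far such a machine is from a general-purpose computer.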

A better title for this article might be “Why computing theory was invented.”

@Brendan

I think you underestimate the amount of theory required to build the pyramids. “Practical thinkers” often can’t go beyond the “obvious” fact that you can only pile rocks so high until they fall over. It takes a conceptual leap and some theory to work out how to go beyond that.

@Marc

Sure, building a pyramid might require a lot of theory about gravity but it wouldn’t require relativity or string theory, or even Newton’s laws of motion. Similarly, most computer programs (machines) are built without any understanding of computation theory. If you want to reason about what happens when the pyramid is traveling near the speed of light, you need more advanced physics theory. If you want to reason about hypothetical computer systems, you need advanced computation theory.

Don’t underestimate the power of trial and error. For example, Edison tried hundreds of filament/bulb combinations. The Egyptians (or aliens) could just have piled up a bunch of rocks at small scale and confirmed that things should work. They were using slave labor anyway, so why would they care? Same thing with computers: I don’t need a formal proof that my algorithm works; I can just get a hand-wavy idea, code it up, and see what happens.

@Brendan

I suggest that you read (or re-read) Turing’s original paper. It is a pretty practical and low-level first definition of a computer. Hardly “the theory of relativity”.

I believe I saw something a few months ago about how the pyramids have a nearly optimal design. It’s been too long and I don’t have a source, but I’m pretty sure someone was arguing that the Egyptians had a good idea how much weight the land could bear, how tall their construction techniques could hold, etc. They weren’t doing Newtonian physics, but they weren’t entirely flying by the seat of their pants either.

Trial and error might get you fairly far with single-purpose machines. But a programmable computer is another matter. I’d think you’d have to have some guiding theory, though not necessarily a theory as complete as that of Church or Turing.

@Marc

Computing theory didn’t predate computers. Your initial post implied you thought it did.

I think I confused things by bringing up pyramids.

@Brendan,

The theory of computation does predate the computer. Even von Neumann’s idea of storing the “algorithm” in the memory of the computer instead of hardwiring it comes from the notion of the universal Turing machine.

You don’t need to understand the theory of computation to check that your algorithm works, but you need the theory of computation to come up with a concept such as “algorithm”.

Bottom line: one thing is inventing computers and another is building them. The former comes first. Then the latter.

I suggest the easy-to-read Logicomix, about the life and career of Bertrand Russell.

I recommend the book “The First Computers—History and Architectures” by Raúl Rojas and Ulf Hashagen, published by MIT Press. It’s clear from reading this that physical machines predated the well-developed theories of Turing and von Neumann.

Yes, computer technology was in many ways driven by war efforts. Part of the reason you don’t often hear about Konrad Zuse is that it’s uncomfortable to acknowledge that someone on the Axis side of WWII developed some of the early computing machinery that led to more theoretically complete machines later on. He wasn’t a Nazi per se, but he wasn’t against them either…

Anyway, some of the earliest machines discussed in the book date back to the early 1930s. That’s six years prior to Alan Turing’s 1936 paper defining what is now called a Turing machine.

Perhaps the question here is what we call a computer. Is the Jacquard loom a computer? Is a calculator a computer? Not in modern terms. To my understanding, at least, a computer is a universal Turing machine.

Interestingly enough, the Jacquard loom gave rise to punched cards, and if topological quantum computation ever takes off, we will go back to computing by braiding, although this time it will be anyons on a topologically multiply connected canvas.

“Konrad Zuse was not against Nazis either” is one of the most stupid comments ever made! I feel even dumber for having read it.

Zuse was an engineer and a German citizen. In those days, there were other reasons for fighting in a war than being a Nazi, like nationalism, which was much stronger in every country then.

I’m glad that von Braun was not too much “not against Nazis” to create rockets as space vehicles and ICBMs.

I’m starting to think that this is all a matter of definitions. What do we mean by “computer” and by “theory”?

For a generous enough definition of “computer”, (as Alex mentions) the Jacquard loom predates modern computation theory by a long way.

On the other front, many people seem to define “theory” to mean “ivory tower embellishments of no practical use that always come after the real invention has happened”. I don’t find that definition all that useful, and prefer something more like “ideas and systems of thought that help to plan and organize practical efforts”.

The theoretical work of Turing and many others did influence and affect the refinement of computers as we know them today. There may be a romantic populism to the idea that dumb luck and persistence could produce a great and complex invention like the computer (or the pyramids for that matter) but such notions don’t hold up to closer scrutiny.

Sixty years ago, “computer” was also a job title, most famously at the Manhattan Project, where teams of women carried out repetitive calculations (in triplicate).

The Bent Pyramid at Dashur suggests there was a fair amount of trial and error in pyramid design.

@Alex,

I consider a woman with a set of instructions and a stack of paper and pencils to be a computer. Compare her to the model of a Turing Machine. It has a table of state transitions with inputs and outputs (instructions), read/write memory (paper, pencils, erasers). If she didn’t make mistakes, wouldn’t quit her job, lived forever, and had infinite paper and pencils, she would look a lot like a Turing Machine. All the analysis and reasoning you can do about Turing Machines would apply to her.
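For what it’s worth, here is a minimal sketch in Python (my own illustrative code, not anyone’s formal definition) of the model being described: a transition table, a tape, and a head, the same ingredients as the human computer with her instructions, paper, and pencils. The example machine simply inverts a string of bits and halts.

```python
# A minimal Turing machine: a transition table, a tape, and a head.
def run(table, tape, state="start", head=0):
    tape = list(tape)
    while state != "halt":
        # read the symbol under the head ("_" is the blank symbol)
        symbol = tape[head] if head < len(tape) else "_"
        write, move, state = table[(state, symbol)]
        # write a symbol, extending the tape to the right as needed
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip("_")

# Transition table: (state, read) -> (write, move, next_state).
# This machine flips every bit, then halts on the first blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(invert, "0110"))  # 1001
```

Swap in a different transition table and the same `run` loop computes something else entirely, which is the universality the thread keeps circling around.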

This lines up with what the original point of this blog post was. Computing theory was developed to answer fundamental questions about the nature of mathematics. Are there numbers that are incalculable? Turing’s work answered this question by analyzing a particular notion of a computer. Church’s work answered the same question by using a class of recursive functions that represent calculation.

My reaction to the comments:

The general purpose computer was invented in order to help to clarify a philosophical question about the foundations of mathematics by both Turing (Turing machine) and Church (Lambda calculus). Giant calculators existed before them (see the 1890 US Census) and little calculators much further back (see the abacus).

However, little and big calculators lacked the if statement and were hard-coded to specific problems instead of having the wonderful universality of Turing and Church’s idea. Certainly the early analog machines could calculate, and thus predict, the phases of the moon or the tides, but they would not have been able to store and play Creed’s entire awful discography or reproduce the wide variety of pornography available on the internet today without rebuilding/rewiring the machines for each task.

“There’s nothing so practical as a good theory”—Kurt Lewin

I think it was invented so that there was an easy way to store and get to information… “super computer”