Wouldn’t trade places

Last week at the Heidelberg Laureate Forum, I was surrounded by the most successful researchers in math and computer science. The laureates had all won the Fields Medal, Abel Prize, Nevanlinna Prize, or Turing Award. Some had even won two of these awards.

I thought about my short academic career [1]. If I had been wildly successful, the most I could have hoped for would be to become one of these laureates. And yet I wouldn’t trade places with any of them. I’d rather do what I’m doing now than have an endowed chair at some university. Consulting suits me very well. I could see teaching again someday, maybe in semi-retirement, but I hope never to see another grant proposal.

* * *

[1] I left academia either once or twice, depending on whether you count my stint at MD Anderson as academic. I’d call my position there, and even the institution as a whole, quasi-academic. I did research and some teaching there, but I also did software development and project management. The institution is a hospital, a university, a business, and a state agency; it can be confusing to navigate.

The great reformulation of algebraic geometry

“Tate helped shape the great reformulation of arithmetic and geometry which has taken place since the 1950’s.” — Andrew Wiles

At the Heidelberg Laureate Forum I had a chance to interview John Tate. In his remarks below, Tate briefly comments on his early work on number theory and cohomology. Most of the post consists of his comments on the work of Alexander Grothendieck.

* * *

JT: My first significant work after my thesis was to determine the cohomology groups of class field theory. The creators of the theory, including my thesis advisor Emil Artin, didn’t think in terms of cohomology, but their work could be interpreted as finding the cohomology groups H⁰, H¹, and H².

I was invited to give a series of three talks at MIT on class field theory. I’d been at a party, and I came home and thought about what I’d talk about. And I got this great idea: I realized I could say what all the higher groups are. In a sense it was a disappointing answer, though that didn’t occur to me then: there’s nothing new in them; they were determined by the great work that had already been done. For that I got the Cole Prize in number theory.

Later when I gave a talk on this work people would say “This is number theory?!” because it was all about cohomology groups.

JC: Can you explain what the great reformulation was that Andrew Wiles spoke of? Was it this greater emphasis on cohomology?

JT: Well, in the class field theory situation it would have been. And there I played a relatively minor part. The big reformulation of algebraic geometry was done by Grothendieck, the theory of schemes. That was really such a great thing, that unified number theory and algebraic geometry. Before Grothendieck, going between characteristic 0, finite characteristic 2, 3, etc. was a mess.

Grothendieck’s system just gave the right framework. We now speak of arithmetic algebraic geometry, which means studying problems in number theory by using your geometric intuition. The perfect background for that is the theory of schemes. ….

Grothendieck’s ideas [about sheaves] were so simple. People had looked at such things in particular cases: Dedekind rings, Noetherian rings, Krull rings, …. Grothendieck said take any ring. … He just had an instinct for the right degree of generality. Some people make things too general, and they’re not of any use. But he just had an instinct to put whatever theory he thought about in the most general setting that was still useful. Not generalization for generalization’s sake but the right generalization. He was unbelievable.

He started schemes about the time I got serious about algebraic geometry, as opposed to number theory. But the algebraic geometers classically had affine varieties, projective varieties, … It seemed kinda weird to me. But with schemes you had a category, and that immediately appealed to me. In the classical algebraic geometry there are all these birational maps, or rational maps, and they’re not defined everywhere because they have singularities. All of that was cleared up immediately from the outset with schemes. ….

There’s a classical algebraic geometer at Harvard, Joe Harris, who works mostly over the complex numbers. I asked him whether Grothendieck made much of a difference in the classical case — I knew for number theorists he had made a tremendous difference — and Joe Harris said yes indeed. It was a revolution for classical algebraic geometry too.

Related: Applied number theory

Mental crypto footnotes

I spoke with Manuel Blum this afternoon about his password scheme described here. This post is a few footnotes based on that conversation.

When I mentioned that some people had reacted to the original post by saying the scheme was too hard, Blum said that he has taught it to a couple of children, ages 6 and 9, who can use it.

He also said that many people have asked for his slide summarizing the method, and whether I could post it. You can save the image below to get the full-sized slide.

This slide and my blog post both use a 3-digit password for illustration, though obviously a 3-digit password would be easy to guess by brute force. I asked Blum how long a password using his scheme would need to be so that no one with a laptop would be able to break it. He said that 12 digits should be enough. Note that this assumes the attacker has access to many of your passwords created using the scheme, which would be highly unlikely.
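To put those numbers in perspective, here’s a quick back-of-the-envelope sketch in Python. This is my own illustration, not Blum’s analysis; it only counts raw candidates and says nothing about how an attacker would actually test guesses.

```python
# A rough search-space comparison (my illustration, not Blum's analysis):
# the number of candidates a brute-force guesser must consider.
for digits in (3, 12):
    candidates = 10 ** digits
    print(f"{digits}-digit password: 10^{digits} = {candidates:,} candidates")

# Output:
# 3-digit password: 10^3 = 1,000 candidates
# 12-digit password: 10^12 = 1,000,000,000,000 candidates
```

A thousand candidates can be exhausted instantly, while a trillion resists casual guessing, though as noted above the more interesting question is how hard it is to recover the secret mapping from observed passwords.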

Update: This algorithm can be broken fairly easily. See comments here.

Proof maintenance

Leslie Lamport coined the phrase “proof maintenance” to describe the process of producing variations of a proof over time.

It’s well known that software needs to be maintained; most of the work on a program occurs after it is “finished.” Proof maintenance is common as well, but it is usually very informal.

Proofs of any significant length have an implicit hierarchical structure of sub-proofs, sub-sub-proofs, and so on. Sub-proofs may be labeled as lemmas, but that’s usually the extent of the organization. Also, the requirements of a lemma may not be precisely stated, and the propositions used to prove the lemma may not be explicitly referenced. Lamport recommends making the hierarchical structure more formal and fine-grained, extending the subdivisions of the proof down to propositions that take only two or three lines to prove. See his paper How to write a 21st century proof.

When proofs have this structure, you can see which parts of a proof need to be modified in order to produce a proof of a new related theorem. Software could help you identify these parts, just as software tools can show you the impact of changing one part of a large program.
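Proof assistants make exactly this kind of structure explicit. Here is a toy sketch in Lean 4 (my illustration, not Lamport’s own notation): each step is a named lemma with a precise statement, and the final theorem cites its sub-proofs by name, so the checker can report exactly which steps break when something upstream changes.

```lean
-- Each sub-proof is a named, separately checkable statement.
theorem step1 (n : Nat) : n + 0 = n := Nat.add_zero n
theorem step2 (n : Nat) : 0 + n = n := Nat.zero_add n

-- The top-level theorem references its sub-proofs explicitly,
-- so a change to step1 or step2 is flagged right here.
theorem top (n : Nat) : n + 0 = 0 + n := by
  rw [step1, step2]
```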

Related: Formal validation methods

Uses for orthogonal polynomials

When I interviewed Daniel Spielman at this year’s Heidelberg Laureate Forum, we began our conversation by looking for common mathematical ground. The first thing that came up was orthogonal polynomials. (If you’re wondering what it means for two polynomials to be orthogonal, see here.)

JC: Orthogonal polynomials are kind of a lost art, a topic that was common knowledge among mathematicians maybe 50 or 100 years ago but is now obscure.

DS: The first course I taught I spent a few lectures on orthogonal polynomials because they kept coming up as the solutions to problems in different areas that I cared about. Chebyshev polynomials come up in understanding solving systems of linear equations, such as if you want to understand how the conjugate gradient method behaves. The analysis of error correcting codes and sphere packing has a lot of orthogonal polynomials in it. They came up in a course in multi-linear algebra I had in grad school. And they come up in matching polynomials of graphs, which is something people don’t study much anymore. … They’re coming back. They come up a lot in random matrix theory. … There are certain things that come up again and again and again so you got to know what they are.
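As a concrete aside (my addition, not part of the interview): Chebyshev polynomials are orthogonal on [−1, 1] with respect to the weight 1/√(1 − x²). The substitution x = cos θ turns Tₙ(x) into cos(nθ), reducing the weighted inner product to an ordinary integral that a few lines of Python can check numerically.

```python
import numpy as np
from scipy.integrate import quad

def chebyshev_inner(m, n):
    """Inner product of T_m and T_n on [-1, 1] with weight 1/sqrt(1 - x^2),
    computed via the substitution x = cos(theta)."""
    value, _ = quad(lambda t: np.cos(m * t) * np.cos(n * t), 0, np.pi)
    return value

print(chebyshev_inner(2, 3))  # ~0: polynomials of distinct degree are orthogonal
print(chebyshev_inner(3, 3))  # ~pi/2: nonzero only when the degrees match
```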


Mathematical beauty

Michael Atiyah quoted Hermann Weyl in the opening talk at the second Heidelberg Laureate Forum:

I believe there is, in mathematics, in contrast to the experimental disciplines, a character which is nearer to that of free creative art.

There is evidence that the relation between artistic beauty and mathematical beauty is more than an analogy. Michael Atiyah recently published a paper with Semir Zeki et al. suggesting that the same part of the brain responds to both.

Love locks

If you walk across the Seine in Paris on the Pont des Arts you’ll see thousands and thousands of love locks. I saw this morning that Heidelberg has its own modest collection of love locks on the Old Bridge across the Neckar.

love locks on Old Bridge across Neckar

These may be new. If they were here last year, I didn’t notice them.

There are locks at several other points along the Old Bridge, but nowhere very many.

love locks on Pont des Arts across Seine

Photo credit: Disdero via Wikimedia Commons

Common sense and statistics

College courses often begin by trying to weaken your confidence in common sense. For example, a psychology course might start by presenting optical illusions to show that there are limits to your ability to perceive the world accurately. I’ve seen at least one physics textbook that also starts with optical illusions to emphasize the need for measurement. Optical illusions, however, take considerable skill to create. The fact that they are so contrived illustrates that your perception of the world is actually pretty good in ordinary circumstances.

For several years I’ve thought about the interplay of statistics and common sense. Probability is more abstract than physical properties like length or color, and so common sense is more often misguided in the context of probability than in visual perception. In probability and statistics, the analogs of optical illusions are usually called paradoxes: the St. Petersburg paradox, Simpson’s paradox, Lindley’s paradox, etc. These paradoxes show that common sense can be seriously wrong without resort to contrived examples. Instances of Simpson’s paradox, for example, pop up regularly in applications.
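As a worked example, here are the numbers from the much-cited kidney stone study: treatment A succeeds more often than treatment B within each stratum, yet less often overall, because treatment A was used disproportionately on the harder, large-stone cases.

```python
# Success counts / totals per stratum, from the oft-cited kidney stone data.
data = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

totals = {"A": [0, 0], "B": [0, 0]}
for stratum, arms in data.items():
    for arm, (wins, n) in arms.items():
        totals[arm][0] += wins
        totals[arm][1] += n
        print(f"{stratum}, treatment {arm}: {wins}/{n} = {wins/n:.1%}")

for arm, (wins, n) in totals.items():
    print(f"overall, treatment {arm}: {wins}/{n} = {wins/n:.1%}")

# Treatment A wins in each stratum (93.1% vs 86.7%, 73.0% vs 68.8%),
# yet treatment B wins overall (82.6% vs 78.0%).
```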

Some physicists say that you should always have an order-of-magnitude idea of what a result will be before you calculate it. This implies a belief that such estimates are usually possible, and that they provide a sanity check for calculations. And that’s true in physics, at least in mechanics. In probability, however, it is quite common for even an expert’s intuition to be way off. Calculations are more likely to find errors in common sense than the other way around.

Nevertheless, common sense is vitally important in statistics. Attempts to minimize the need for common sense can lead to nonsense. You need common sense to formulate a statistical model and to interpret inferences from that model. Statistics is a layer of exact calculation sandwiched between necessarily subjective formulation and interpretation. Even though common sense can go badly wrong with probability, it can also do quite well in some contexts. Common sense is necessary to map probability theory to applications and to evaluate how well that map works.