The comments on the previous post touched on surprising applications of math, so I thought I’d expand this theme into its own post. Below I’ll give a couple of general examples of surprising applications, and then a couple of more personal examples that surprised me.
Number theory has traditionally been the purest of pure mathematics. People study number theory for the joy of doing so, not to make money. At least that was largely true until the advent of public key cryptography. The difficulty of solving certain number theory problems now ensures the difficulty of decrypting private communication, or so we hope. (By the way, I’ve always thought Euler deserved part of the credit for the RSA encryption scheme. Maybe it should be called RSAE encryption. R, S, and A came up with the brilliant idea to apply E’s theorem to cryptography.)
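To make E’s role concrete, here is a toy sketch of RSA resting directly on Euler’s theorem. The primes are tiny and purely illustrative; this is not a real implementation, just the arithmetic skeleton.

```python
# Toy RSA resting on Euler's theorem: if gcd(m, n) = 1 then
# m**phi(n) % n == 1.  The primes below are far too small for real use.
p, q = 61, 53
n = p * q                 # public modulus
phi = (p - 1) * (q - 1)   # Euler's totient of n

e = 17                    # public exponent, chosen coprime to phi
d = pow(e, -1, phi)       # private exponent: (e * d) % phi == 1 (Python 3.8+)

m = 42                    # "message", an integer below n and coprime to n
c = pow(m, e, n)          # encrypt with the public key (n, e)
m2 = pow(c, d, n)         # decrypt with the private key d
# m2 == m because m**(e*d) = m**(1 + k*phi) ≡ m (mod n) by Euler's theorem
```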
Non-Euclidean geometry started as a pure mathematical abstraction. Of course the physical world is Euclidean, but let’s see what happens if we monkey with Euclid’s fifth postulate. Then along came Einstein, and suddenly the real world is non-Euclidean.
One personal application of math that I found surprising was using Fibonacci numbers in practical computation. Computing Fibonacci numbers is a computer science cliché, but I actually needed to compute Fibonacci numbers for a numerical integration problem. I wrote up the details in Fibonacci numbers at work.
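The details are in the linked post. As a generic sketch (not necessarily the method used there), Fibonacci numbers can be computed in O(log n) arithmetic steps via the fast-doubling identities F(2k) = F(k)(2F(k+1) - F(k)) and F(2k+1) = F(k)^2 + F(k+1)^2:

```python
def _fib_pair(n):
    """Return (F(n), F(n+1)) using the fast-doubling identities."""
    if n == 0:
        return 0, 1
    a, b = _fib_pair(n // 2)   # a = F(k), b = F(k+1), where k = n // 2
    c = a * (2 * b - a)        # F(2k)
    d = a * a + b * b          # F(2k+1)
    return (c, d) if n % 2 == 0 else (d, c + d)

def fib(n):
    """F(0) = 0, F(1) = 1, F(n) = F(n-1) + F(n-2)."""
    return _fib_pair(n)[0]
```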
Another application that surprised me was using the trapezoid rule for real work. The trapezoid rule is a crude numerical integration technique. It’s good for teaching because it’s very simple, but it’s not very accurate. Or so I thought. It’s not very accurate in general, but in the right circumstances, it can be extraordinarily accurate. I explain more in Three surprises with the trapezoid rule.
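The right circumstances include integrating a smooth periodic function over a full period, where the trapezoid rule converges faster than any power of the step size. A short check contrasting the two regimes (the function exp(cos x) is my illustrative choice, not necessarily the one from the linked post; its integral over a period is 2π I₀(1)):

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule with n equal subintervals on [a, b]."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

# Generic case: only O(h^2) accuracy.  For x^2 on [0, 1] the error
# works out to exactly 1/(6 n^2).
err_generic = abs(trapezoid(lambda x: x * x, 0.0, 1.0, 16) - 1.0 / 3.0)

# Smooth periodic integrand over a full period: spectral accuracy.
f = lambda x: math.exp(math.cos(x))
a16 = trapezoid(f, 0.0, 2 * math.pi, 16)
a32 = trapezoid(f, 0.0, 2 * math.pi, 32)  # doubling n changes almost nothing
```

With 16 subintervals the x² error is about 6.5e-4, while the periodic result already agrees with the doubled-resolution answer to near machine precision.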
One surprising non-application has been differential equations. For the past three centuries, differential equations have been at the heart of applied math. One could argue that to first approximation, applied math equals differential equations and supporting material. But I personally have not had nearly as much opportunity to use differential equations professionally as I expected, even though that was my specialization in grad school.
Surprising uses of mathematics (for me):
(1) Finite fields. Back when I studied them, I thought they would be a waste of time, yet I have since written two papers filled with finite fields: http://arxiv.org/abs/0705.4676 and http://arxiv.org/abs/1008.1715. It turns out that finite fields are immensely useful in practice!
(2) Gray codes look like a geeky trick that nobody will ever need… but they are, again, very useful and quite interesting (I have recently used them quite a bit http://arxiv.org/abs/0909.1346).
(3) Linear algebra… I used to be quite bored by it (who cares about matrices?), but it is amazingly useful in practice. Many, many problems boil down to linear algebra.
(4) Despite very strong training in mathematics, I found that I seriously lack probabilistic intuition. Probabilities are hugely important. Surprisingly so.
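As a quick illustration of point (2), the standard reflected binary Gray code and its inverse fit in a few lines of Python:

```python
def to_gray(n):
    """Reflected binary Gray code: consecutive integers differ in one bit."""
    return n ^ (n >> 1)

def from_gray(g):
    """Invert to_gray by folding the high bits back down."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

codes = [to_gray(i) for i in range(8)]  # [0, 1, 3, 2, 6, 7, 5, 4]
# Every adjacent pair of codes differs in exactly one bit:
assert all(bin(x ^ y).count("1") == 1 for x, y in zip(codes, codes[1:]))
assert all(from_gray(to_gray(i)) == i for i in range(1024))
```

The one-bit-per-step property is what makes Gray codes useful in practice, e.g. for enumerating subsets while changing only one element at a time.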
Surprising non-use:
(1) Analysis in general is vastly overrated. Elementary calculus is useful, up through optimization and Lagrangians, but even that usefulness is limited. I have definitely used it, but I could have done much of my work without it.
(2) Applied mathematics (including differential equations) is really not all that useful. At least, to me.
Daniel: Comparing our experience, I’ve had more use for analysis and less use for discrete math.
Regarding linear algebra, I’ve been surprised by how small the matrices are that come up in my own work. When I was doing differential equations, everything eventually required solving huge systems of equations. The matrices I’ve used in statistics have been tiny by comparison.
John: solving linear mixed model equations, one can easily end up with huge matrices. The nice thing is that in many applications they are quite sparse.
Jim Townsend successfully modelled racism with a negatively curved Riemannian space.
At least for a while, people were making money with a one-semester class on martingales.
This year I used Schur’s lemma to find the solution to a convex optimization problem. I never thought that group theory would show up in convex optimization, although it has always been regarded as an extremely useful tool in physics, chemistry, and materials science.
I, too, studied differential equations in grad school and have rarely used them in my professional work. However, I often use related concepts that I first encountered in the context of differential equations: numerical integration, numerical linear algebra, root-finding, adaptive (mesh) refinement, and iterative systems (the DE’s little brother). So although I don’t solve DEs, the experience I gained studying them has been put to good use.
My surprise has been group theory, which I hated in school. Yet I somehow encounter some group about once per year. Often the issue is related to symmetries or permutations.
Rick: Here’s an idea from PDEs that I applied to statistics. When you’re solving a time-dependent PDE numerically, the solution doesn’t change much between time steps. So if you’re using iterative linear solvers, the solution from the previous time step makes an excellent starting point for the iteration at the next time step.
In Bayesian simulation, the posterior mode doesn’t move much when you add a little data. So the posterior mode from the previous data set makes an excellent starting point in finding the posterior mode after adding one observation.
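The warm-start idea can be sketched with a toy implicit-Euler heat equation, where each time step solves a linear system by Jacobi iteration. All names and parameter values here are my own illustrative choices:

```python
import math

def jacobi_solve(b, r, x0, tol=1e-10, max_iter=10_000):
    """Jacobi iteration for (I + r*L) x = b, where L is the 1-D Laplacian
    stencil with zero boundary values.  Returns (solution, iterations)."""
    n = len(b)
    x = list(x0)
    for k in range(1, max_iter + 1):
        new = [(b[i] + r * ((x[i - 1] if i > 0 else 0.0) +
                            (x[i + 1] if i < n - 1 else 0.0))) / (1 + 2 * r)
               for i in range(n)]
        if max(abs(u - v) for u, v in zip(new, x)) < tol:
            return new, k
        x = new
    return x, max_iter

# Implicit Euler for the heat equation: each step solves (I + r*L) u_new = u_old.
n, r, steps = 50, 0.5, 20
u0 = [math.sin(math.pi * (i + 1) / (n + 1)) for i in range(n)]  # initial profile

cold_iters, warm_iters = 0, 0
u_cold, u_warm = u0, u0
for _ in range(steps):
    u_cold, k = jacobi_solve(u_cold, r, [0.0] * n)  # cold start: from zero
    cold_iters += k
    u_warm, k = jacobi_solve(u_warm, r, u_warm)     # warm start: previous step
    warm_iters += k
# warm_iters comes out well below cold_iters
```

Since the solution changes little between time steps, the warm-started iteration begins much closer to the answer and needs noticeably fewer sweeps in total.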
I’d disagree that analysis and differential equations aren’t applied much. Differential equations, translated into large systems of linear equations, are at the core of engineering. I chastise newbie engineers who think linear algebra and ODE/PDE classes were a waste of time. It’s not really possible to understand the output unless you know what the sausage grinder did to the inputs. In the early days, you couldn’t understand the error codes unless you knew what a singular matrix was – though software has “progressed” to the point where we can now employ an infinite number of monkeys who might occasionally conjure up an airplane in between acts of Hamlet. :-)
Although I’m still looking for an application of the busy beavers…
This comes back to a previous post you had about how hard it is to imagine exponential growth. Can someone imagine a sequence growing so fast that it’s not even computable? It blows my mind.
In fact, Rivest, Shamir, and Adleman were not the first to discover the RSA algorithm.
UK spook Cliff Cocks beat them to it:
http://www.gchq.gov.uk/Press/Pages/100th-IEEE-milestone-award.aspx
I agree that linear algebra is surprisingly applied, and I only understood the importance of LA when I came across the excellent book by Gilbert Strang. All those abstract concepts of vector spaces and subspaces suddenly made sense!
I think the whole area of optimisation and stochastic processes is hugely applicable. In fact, almost all the math related to operations research has found successful applications.