It all boils down to linear algebra

When I was in college, my view of applied math was something like the following.

Applied math is mostly mathematical physics. Mathematical physics is mostly differential equations. Numerical solution of differential equations boils down to linear algebra. Therefore the heart of applied math is linear algebra.

I still think there’s a lot of truth in the summary above. Linear algebra is very important, and a great deal of applied math does ultimately depend on efficient solutions of large linear systems. The weakest link in the argument may be the first one: there’s a lot more to applied math than mathematical physics. Mathematical physics hasn’t declined, but other areas have grown. Still, areas of applied math outside of mathematical physics and outside of differential equations often depend critically on linear algebra.
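To make the middle links of that chain concrete: discretize a two-point boundary value problem by finite differences and the only computation left is a linear solve. Here is a minimal sketch in Python; the grid size and the test equation are my own choices for illustration.

```python
import numpy as np

# Discretize -u''(x) = f(x) on [0, 1], u(0) = u(1) = 0, with centered
# differences: (-u[i-1] + 2u[i] - u[i+1]) / h^2 = f(x[i]).
# The differential equation has become a tridiagonal linear system.
n = 50                                  # number of interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

f = np.pi**2 * np.sin(np.pi * x)        # chosen so the exact solution is sin(pi x)

A = (2.0 * np.eye(n)                    # tridiagonal matrix: 2 on the diagonal,
     - np.eye(n, k=1)                   # -1 on the super- and subdiagonals
     - np.eye(n, k=-1))

u = np.linalg.solve(A, h**2 * f)        # the linear-algebra core of the computation

print(np.max(np.abs(u - np.sin(np.pi * x))))   # discretization error, O(h^2)
```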

I’d certainly recommend that anyone interested in applied math become familiar with numerical linear algebra. Whether your specialty is differential equations, optimization, or one of many other fields, you need at least a working knowledge of numerical linear algebra if you’re going to compute anything. As Stephen Boyd points out in his convex optimization class, many of the breakthroughs in optimization over the last 20 years have at their core breakthroughs in numerical linear algebra. Improved algorithms have sped up the solution of very large linear systems more than Moore’s law has.

It may seem questionable to say that linear algebra is at the heart of applied math because it’s linear. What about nonlinear applications, such as nonlinear PDEs? Nonlinear differential equations lead to nonlinear algebraic equations when discretized. But these nonlinear systems are solved via iterations of linear systems, so we’re back to linear algebra.
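The post doesn’t name the iteration, but Newton’s method is the canonical example: each step linearizes the system and solves J(x) dx = -F(x), so the nonlinear solve is a sequence of linear solves. A minimal sketch in Python; the two-equation test system is my own, chosen for illustration.

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Solve F(x) = 0 by Newton's method. Each iteration linearizes
    the system and solves J(x) dx = -F(x), so even a nonlinear solve
    reduces to repeated linear algebra."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = np.linalg.solve(J(x), -F(x))   # the linear solve at the core
        x = x + dx
        if np.linalg.norm(dx) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Example: intersect the circle x^2 + y^2 = 4 with the parabola y = x^2.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]**2])
J = lambda v: np.array([[ 2.0*v[0], 2.0*v[1]],
                        [-2.0*v[0], 1.0     ]])

print(newton(F, J, x0=[1.0, 1.0]))   # converges to about (1.2496, 1.5616)
```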

5 thoughts on “It all boils down to linear algebra”

  1. This especially goes for a lot of data science or computational problems. Often you’ll find yourself working with something that could be called a vector and something that could be called a matrix, doing matrix-vector calculations.

  2. I think that one of the reasons linear algebra is so important in numerical analysis is that systems of linear equations are exactly solvable by algorithms of polynomial complexity (solving an n×n system by Gaussian elimination requires O(n^3) operations, for example; see the sketch after these comments). From a numerical analysis point of view, linear algebra can solve problems without generating consistency errors, so you can estimate the approximation errors (due to discretisation) separately from the rounding errors (due to the linear algebra computations). It is possible to keep the same approach (reduction to linear equations) even for nonlinear problems, where computational stability becomes very important.

  3. Even if it’s not the heart (and I tend to agree with you that it is), it is certainly the foundation, and often part of the superstructure.

  4. Very true. The only thing I would add is that a surprising amount of pure math boils down to linear algebra as well.

    No one has ever written a popularization of linear algebra that I’m aware of, making it the greatest unsung hero in the mathematical sciences.
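To make the O(n^3) operation count in the second comment concrete, here is Gaussian elimination with partial pivoting in Python, checked against NumPy’s library solver. This is a textbook sketch, not production code; in practice you’d just call np.linalg.solve.

```python
import numpy as np

def solve_gauss(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.
    The elimination loop updates full rows for each of the n pivots,
    which is where the O(n^3) operation count comes from."""
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    n = len(b)
    for k in range(n - 1):                      # forward elimination: O(n^3)
        p = k + np.argmax(np.abs(A[k:, k]))     # partial pivot row, for stability
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):              # back substitution: O(n^2)
        x[i] = (b[i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    return x

A = [[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]]
b = [8.0, -11.0, -3.0]
print(solve_gauss(A, b))       # expect [ 2.  3. -1.]
print(np.linalg.solve(A, b))   # library check
```

The pivoting step is the stability point the comment raises: without it, elimination can amplify rounding errors badly even on well-conditioned systems.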
