Chaotic systems are unpredictable. Or rather, chaotic systems are not deterministically predictable in the long run.
You can make predictions if you weaken one of these requirements. You can make deterministic predictions in the short run, or statistical predictions in the long run.
Lyapunov exponents are a way to measure how quickly the short run turns into the long run. There is a rigorous definition of a Lyapunov exponent, but I often hear the term used metaphorically. If something is said to have a large Lyapunov exponent, it quickly becomes unpredictable. (Unpredictable as far as deterministic prediction goes.)
In a chaotic system, the uncertainty around the future state of the system grows exponentially. If you start with a small sphere of initial conditions at time t = 0, a sphere of diameter ε, that sphere becomes an ellipsoid over time. The exponential growth rate of the largest diameter — the logarithm of the ratio of the maximum ellipsoid diameter to the diameter of the initial sphere, divided by the elapsed time — is the Lyapunov exponent λ. (Technically you have to take a long-time limit and average this quantity over all possible starting points.)
With a Lyapunov exponent λ, the uncertainty in your solution is bounded by ε exp(λt).
You can predict the future state of a chaotic system if one of ε, λ, or t is small enough. The larger λ is, the smaller t must be, and vice versa. So in this sense the Lyapunov exponent tells you how far out you can make predictions; eventually your error bars are so wide as to be worthless.
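To make that trade-off concrete, here is a minimal sketch. Solving ε exp(λt) = tol for t gives the prediction horizon; the function name and the numbers (λ ≈ 0.9, a value often quoted for the Lorenz system) are my choices for illustration, not anything canonical.

```python
import math

def prediction_horizon(eps, lam, tol):
    """Time at which uncertainty growing like eps * exp(lam * t)
    first reaches the tolerance tol, i.e. t = ln(tol / eps) / lam."""
    return math.log(tol / eps) / lam

# Initial uncertainty 1e-10, Lyapunov exponent 0.9 (roughly the value
# often quoted for the Lorenz system), predictions useless past error 1.0.
t_max = prediction_horizon(1e-10, 0.9, 1.0)  # about 25.6 time units
```

Note what the logarithm does to you here: measuring the initial state ten times more precisely (ε = 10⁻¹¹) buys you only ln(10)/λ ≈ 2.6 more time units. Exponential error growth makes better measurement a losing game.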
A Lyapunov exponent isn’t something you can easily calculate exactly except for toy problems, but it is something you can easily estimate empirically. Pick a cloud of initial conditions and see what happens to that cloud over time.
2 thoughts on “Lyapunov exponents”
I was confused by the statement: “chaotic systems are not deterministically predictable in the long run”. Why are they unpredictable (in the deterministic sense)? Suppose we have an initial value problem for a double pendulum. I understand that to solve for the motion of the pendula we’d have to use some numerical method to approximate the solution to their equations of motion, and that the resulting trajectories are very sensitive to initial conditions. But these trajectories aren’t probabilistic, right? They are functions of time, and these functions aren’t random variables, right? What makes this behavior become non-deterministic for large t? We may not have an explicit expression for the trajectories, but we can find some implicit expression that tells us what they will be. So what I don’t fully understand is how sensitivity to initial conditions makes these systems somehow turn probabilistic, or at least not deterministically predictable.
I have seen this topic come up many times and it has really puzzled me why chaotic systems are referred to as indeterministic. I really hope you can shine some light of wisdom on this comment. Thank you for your time
Maybe it would help to use different words than deterministic and statistical. What I mean by those terms is the contrast between knowing where a particular particle will be versus knowing on average how a large ensemble of particles will be distributed.
If you take a tiny sphere around the initial conditions and evolve that sphere over time, it eventually fills the entire space. Since your initial condition could have been any one of the points in that ball, you don’t know where in the space you’ve ended up: every point in the space is somewhere your initial point may have gone.
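You can watch this happen numerically. The sketch below (my choice of example; the logistic map x ↦ 4x(1 − x), the starting point, and the constants are all arbitrary illustrations) starts a “sphere” of 1,000 initial conditions packed into an interval of width 10⁻⁶ and iterates the map.

```python
import random

def logistic(x):
    return 4.0 * x * (1.0 - x)

random.seed(0)
# A tiny ball of initial conditions: 1,000 points within 1e-6 of x = 0.3.
cloud = [0.3 + 1e-6 * random.random() for _ in range(1000)]

for step in range(50):
    cloud = [logistic(x) for x in cloud]

spread = max(cloud) - min(cloud)  # close to 1: the cloud fills [0, 1]
```

Since this map’s Lyapunov exponent is ln 2, the cloud’s width roughly doubles each step, so the initial 10⁻⁶ already exceeds the whole interval after about 20 iterations. Knowing your starting point to six decimal places tells you essentially nothing about where you are at step 50.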
The term “probabilistic” has some metaphysical overtones that can get in the way of understanding. What matters is that you don’t exactly know your starting conditions. It doesn’t matter why you don’t know them, e.g. whether it’s measurement precision or something “random.”
It’s very common to model any source of uncertainty as randomness, and we do that because it works well. After a while we start to think that all uncertainty really is randomness, whether it is or not, and that can be misleading. I wrote more about this in the post Random is as random does. It argues for a sort of instrumental view of randomness, a way of working with uncertainty that isn’t meant to be interpreted philosophically.