The Butterfly Effect is Much Worse Than We Thought
A butterfly flaps its wings in China and causes a tornado in Texas. Who hasn't heard of the “Butterfly Effect”? A new paper now says that the butterfly effect might be even more dramatic than we thought: you don't need butterflies, a wiggling molecule will do. Let's have a look.
The atmosphere around our planet is governed by the famous Navier-Stokes equation. It's a non-linear equation that's notoriously hard to solve and gives rise to turbulence and chaos. If you figure out how to solve it, the Clay Institute has a one-million-dollar prize on it, so if you can spare some time on the weekend, maybe give it a go.
The 1960s were in some sense simpler times than today, but solving the Navier-Stokes equation was as difficult then as it is now. This is why Edward Lorenz, in his 1969 paper on the butterfly effect, used a very simplified set of equations. They were inspired by what's going on in Earth's atmosphere but far from the real thing. It's now known as the Lorenz model.
The Lorenz model has become the most famous example of chaos. This is because if you solve the equations, the solutions are very sensitive to the exact starting point. Make it a tiny bit different, and they will look similar initially, but then diverge until they are completely uncorrelated. You can plot the solutions as curves in an abstract, three-dimensional space, where they converge to a peculiar shape that, coincidentally, resembles a butterfly.
You can find this sensitivity to the exact initial conditions in many other systems, like the double pendulum. It's the hallmark of chaos, and it's often what people think of as the butterfly effect: the sensitivity to initial conditions.
But really Lorenz was saying more than that. He was saying that small disturbances in one place can increase to large disturbances elsewhere. Though he wasn’t talking about butterflies in his paper but about seagulls. Then again, you know, for a physicist that’s basically the same anyway.
Lorenz’s famous paper about the butterfly effect can’t tell us what the effect of a butterfly in China is because the model he used doesn’t have any notion of places to begin with.
Lorenz made it very clear that he was just conjecturing the effects of a localized small disturbance, a seagull crapping or whatever. Flapping I mean, flapping.
He turned out to be correct, though, as so often in science, the true story is vastly more complicated. For one thing, the weather isn't just chaotic. It's sometimes chaotic and sometimes not. This is why on some days, in some places, a weather forecast is good for two weeks, while elsewhere they can't even get the next morning right. The other thing is that, well, we don't have data for butterflies flapping in China, so no one could actually test whether Lorenz was right.
But scientists did confirm from observations that perturbations on scales of maybe 100 kilometres end up influencing much larger areas, potentially the weather of the entire planet. So basically, Lorenz was right. And it's not just about size; more importantly, it's also about energy. Changes that require only very little energy can end up moving around huge amounts of energy.
The new paper looks at the question of just how small a disturbance can be and still influence larger scales. Is a butterfly large enough? Do you need an elephant flapping its ears? Maybe a jumbo jet? Well, they find, amazingly and rather concerningly, that even the motions of molecules are enough to trigger turbulence all over the place. That's right, molecules. They write that their computer simulations suggest that “Even the inevitably present molecular noise … is sufficient to trigger spontaneous stochasticity.”
This is a super important result, because it affects our understanding of so many systems, from the climate to nuclear fusion to galaxy formation. You see, since they can't solve the Navier-Stokes equation, scientists approximate solutions with computer simulations. But these simulations always use a grid of finite size. And if you think there's a butterfly effect going on below the scale of your grid, then you have to assume that some energy propagates up from small to large scales. In practice, they account for this by adding some sort of noise.
You might then think that if computers get more powerful, you can make the grid smaller, and eventually you’ll capture all sources of noise and get much more accurate predictions. But this new paper shows that this is basically impossible because you’d have to go down to the size of molecules. Concretely, they write that “For climate models, even if the projected goal of 1 km horizontal resolution in the next decade is achieved, such refined resolution will not obviate the need for stochastic models.”
They don't say anything about galaxy formation, but I expect this to matter there, too. And maybe Jonathan Oppenheim, whose postquantum gravity involves stochastic noise, may want to have a close look at this paper.