Merriam-Webster defines oscillation in various ways, including “a flow of electricity changing periodically from a maximum to a minimum” and “a flow periodically changing direction.”
How about an example?
Think of a Slinky toy (essentially just a metal spring). When you hold one end of the Slinky a few feet above the floor and let the other end drop, the bottom of the spring falls to its lowest point, bounces back up to near where it started, falls again but not quite as far as the first time, and so on, until it settles at equilibrium. Those movements are a type of up-and-down oscillation.
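If you'd rather watch that settling-down happen in numbers, here's a tiny Python sketch of a damped oscillation. The mass, stiffness, and friction values are arbitrary stand-ins chosen for illustration, not measurements of a real Slinky.

```python
# Tiny sketch of a damped oscillation, like the Slinky settling down.
# Mass, stiffness, and friction are arbitrary illustrative values.

def damped_bounce(steps=1000, dt=0.01, mass=0.1, stiffness=5.0, friction=0.08):
    """Displacement from the resting point over time (semi-implicit Euler)."""
    position, velocity = 1.0, 0.0        # released one unit below the resting point
    trace = []
    for _ in range(steps):
        force = -stiffness * position - friction * velocity   # spring pull plus drag
        velocity += (force / mass) * dt
        position += velocity * dt
        trace.append(position)
    return trace

trace = damped_bounce()
print(f"biggest early swing: {max(abs(x) for x in trace[:200]):.2f}")
print(f"biggest late swing:  {max(abs(x) for x in trace[-200:]):.2f}")
```

The early swings are large and the late swings have nearly vanished: the spring settling at equilibrium.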
But in the world of complex systems, oscillations don’t always work like the Slinky: they don’t always get smaller and then settle at equilibrium. In fact, depending on which actions we take in response to an initial event—and the timing of those actions—oscillations can get larger and larger!
Think of a seismograph as an earthquake ramps up. An action can cause more than an equal and opposite reaction. And the likelihood of amplification increases if all relevant factors aren’t taken into account prior to announcing or executing the action.
Let’s bring this back to systems thinking and system dynamics.
In her foundational book Thinking in Systems, Donella Meadows presents an example of a car dealer observing that an increase in sales is decreasing inventory. The dealer wants to wait long enough to make sure the sales increase isn’t just a blip. Then she starts ordering more cars to meet demand. But there’s a delay before the inventory becomes available, during which inventory drops even more as sales continue. So orders end up increasing again.
But on the back side of this dynamic, as the system catches up with earlier actions, inventory arrives… and arrives… and arrives… ultimately resulting in a glut of cars. Whoops. So orders will eventually be cut… and cut… and the oscillation will swing in the opposite direction.
It might seem like shortening the delays in the system, or shortening the dealer's reaction time, could smooth out these oscillations. But, perhaps counterintuitively, Meadows writes that shortening delays may not have much effect. And shortening the dealer's reaction time (so that she acts on even less information) actually amplifies the oscillations! Instead, lengthening the reaction time and gathering more information before acting can smooth out the oscillations and make them more manageable.
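If you like to poke at dynamics like this in code, here's a rough Python sketch of the dealer's loop. It is not Meadows's actual model, and every number in it (daily sales, the delay lengths, the ten-day inventory target) is invented for illustration, but it has the same three delays, and you can compare a dealer who reacts in two days with one who spreads the correction over ten.

```python
# Rough sketch of the car-dealer loop described above. Not Meadows's exact
# model: the structure (perception delay, response delay, delivery delay)
# follows her example, but all numbers are made up for illustration.
from collections import deque

def simulate(response_delay, days=150, perception_delay=5, delivery_delay=5):
    """Daily inventory for a dealer who spreads corrections over `response_delay` days."""
    inventory = 200.0                            # ten days of coverage at 20 cars/day
    perceived_demand = 20.0
    pipeline = deque([20.0] * delivery_delay)    # orders already in transit
    history = []
    for day in range(days):
        demand = 22.0 if day >= 25 else 20.0     # sales step up 10% and stay there
        # Perception delay: the dealer's belief about demand adjusts gradually.
        perceived_demand += (demand - perceived_demand) / perception_delay
        # Policy: try to keep ten days of perceived demand on the lot.
        gap = 10 * perceived_demand - inventory
        # Response delay: close only a fraction of the gap with each day's order.
        orders = max(0.0, perceived_demand + gap / response_delay)
        pipeline.append(orders)
        inventory += pipeline.popleft()          # delivery delay: today's arrivals were ordered days ago
        inventory -= min(demand, inventory)      # can't sell cars that aren't on the lot
        history.append(inventory)
    return history

for rd in (2, 10):
    inv = simulate(response_delay=rd)
    print(f"response delay {rd:>2} days -> inventory ranges from "
          f"{min(inv):.0f} to {max(inv):.0f} cars")
```

With these made-up numbers, the quick-reacting dealer's inventory swings far more wildly than the patient dealer's, which is the counterintuitive effect described above.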
Of course, it’s possible to wait too long (to wit: our coming climate change disaster), during which time reinforcing loops can spiral nearly out of control. And it’s much easier to prevent a reinforcing loop from getting started than to slow down a reinforcing loop that’s already picked up momentum. (Reinforcing loops are, in essence, either vicious or virtuous spirals.)
Takeaway: finding the right response delay is a delicate balance, but it's an important key to keeping a system in a relatively steady state.
As Meadows puts it, “… some delays can be powerful policy levers. Lengthening or shortening them can produce major changes in the behavior of systems.”1 The desire for soft landings is near-universal, but achieving one can be diabolically hard. Oscillations are one reason why.
Want to learn more?
This article by Meadows digs into the concept of leverage points. The article is long, and if you’re only interested in oscillations, those are discussed in the section labeled “9. The lengths of delays, relative to the rate of system changes.” But really, the entire article is worthwhile.
So, um, will the economy stick a soft landing this time?
The outcome remains to be seen. But oscillations—and their echoes—will play a key role.
-<>-<>-<>-
Extra, Extra!
Three links from the depths of my bookmark archives; think of these as tangential extras for curious readers:
1. The Terrifying Warning Lurking in the Earth’s Ancient Rock Record - by Peter Brannen in The Atlantic - One of the best articles I’ve read for perspective on climate change.
2. Chaos Engineering - Part 1 by Adrian Hornsby - On building intuition through chaos.
3. The Hardware/Software Interface Class videos - by Luis Ceze and Gaetano Borriello - Fabulous deep dive into low-level programming. Used to be a Coursera MOOC, but you can still watch the videos.
1. Meadows, Donella. Thinking in Systems. Chelsea Green Publishing, 2008, Chapter 2.