Sometimes risk mitigation works. It doesn’t feel that way because when risk mitigation works, nothing happens. But not all risks lead to catastrophe, as long as people tasked with mitigating those risks keep their eyes on the goal.
For example, it’s possible that the Y2K computer bug actually did pose an existential threat to civilization, and that we stopped its effects from materializing.
Another example in recent history was the 2014 Ebola outbreak in Guinea, Sierra Leone, and Liberia. The outbreak got out of control, cases began appearing in other countries, and each case required several medical professionals to treat it well (or attempt to). As a rough estimate, if one Ebola patient required five medical professionals for adequate treatment, and if those professionals themselves ran a real risk of becoming infected, the biggest danger was that at some point there would not be enough medical professionals left, and the system would collapse, followed by social and economic chaos.
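To see why that arithmetic is so alarming, here's a minimal back-of-envelope sketch in Python. Only the five-professionals-per-patient ratio comes from the estimate above; the staff pool, starting caseload, growth rate, and infection rate are hypothetical placeholders, not epidemiological data.

```python
# Back-of-envelope model of the staffing-collapse scenario described above.
STAFF_PER_PATIENT = 5        # from the rough estimate in the text
staff = 50_000.0             # hypothetical pool of medical professionals
cases = 100.0                # hypothetical starting caseload
GROWTH_PER_WEEK = 1.3        # hypothetical: 30% weekly case growth, unchecked
STAFF_INFECTION_RATE = 0.05  # hypothetical: 5% of treating staff infected per week

week = 0
while staff >= cases * STAFF_PER_PATIENT:
    week += 1
    cases *= GROWTH_PER_WEEK
    # Infected caregivers stop being caregivers and become patients themselves.
    staff -= cases * STAFF_PER_PATIENT * STAFF_INFECTION_RATE

print(f"Week {week}: {cases:,.0f} cases need {cases * STAFF_PER_PATIENT:,.0f} "
      f"professionals, but only {staff:,.0f} remain.")
```

The exact numbers don't matter; the point is that exponential case growth against a finite staff pool guarantees a crossover, and every infected caregiver brings it closer.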
So, how did the world avoid that outcome?
Doctors and scientists stopped the Ebola outbreak with public health measures, vaccination, and luck: A vaccine had been developed before the outbreak but hadn't yet entered human clinical trials, and most drug candidates that reach human trials ultimately fail. But when health organizations decided to try this vaccine, it proved almost entirely effective, with no serious adverse events reported. The vaccine hit it out of the park on the first try. Today, no one talks about the 2014 Ebola outbreak. But it's possible that, without the vaccine, we'd be in a very different place.
With Y2K, armies of COBOL programmers came out of retirement, and some younger programmers learned COBOL, all to perform the tedious work of re-coding software to handle four-digit years instead of two-digit ones. The risk was that software storing years as two digits ('00' for 1900, '99' for 1999) would, when the calendar rolled over to 2000, register the year as 1900, corrupting data output, confusing program logic, or crashing outright. That might not be a big deal for a hobbyist's spreadsheet, but it could be a huge deal for the software that runs the power grid, and for everyone depending on it.
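Here's a minimal sketch of the failure mode in Python rather than COBOL (the mechanics differed across systems, but the logic error is language-independent): any code that prefixes a stored two-digit year with a hard-coded '19' breaks the moment the century rolls over.

```python
def parse_year(two_digit_year: str) -> int:
    """Pre-Y2K-style parsing: assumes every two-digit year belongs to the 1900s."""
    return 1900 + int(two_digit_year)

print(parse_year("99"))  # 1999 -- worked fine for decades
print(parse_year("00"))  # 1900 -- the rollover bug; should be 2000

# Downstream date arithmetic then goes haywire: an interval that should
# span one year comes out as minus ninety-nine.
print(parse_year("00") - parse_year("99"))  # -99
```

The fix (widening date fields to four digits, or "windowing" low two-digit years into the 2000s) was conceptually simple, but it had to be found and applied, field by field, across vast amounts of legacy code.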
All these programmers collectively updated enough date fields that, when the clock chimed midnight on January 1, 2000, nothing much happened. Some people theorize that nothing much would have happened even without the Y2K effort, but that’s the rub. Was Y2K a real systemic risk or a nothingburger? It’s hard to tell, because the victory (if there was one) was silent.
The phenomenon of silent victories fuels naysayers who then argue against mitigating future problems. And that’s a real systemic risk. Take climate change deniers.
Are there any of those still left?
Yes, though as extreme climate events pile up, naysayers dwindle. But their pushback against climate change efforts has delayed mitigation by decades. We’re now at a stage where we can no longer prevent some terrible effects of climate change. It would have been far better to risk naysayers’ ridicule and dive in earlier and stronger.
Of course, climate change is a much more difficult problem than the Y2K bug. We can’t just re-code some date fields in software. It’s also a much larger problem than the Ebola outbreak was, since it’s happening in many places at once and there’s no single solution.
For a climate mitigation effort to have any hope of success, a few principles can serve as a guide:
Human behavior isn’t generally helpful in mitigating global risks: In an emergency, people act to preserve their own lives, health, well-being, jobs, and families. That’s understandable. But any assumption that people will collectively pare back spending, traveling, or earning on their own initiative is naive. They will not, unless governments offer massive individual incentives and—importantly—find a way to regain people’s trust.
People need to see results fast if they're sacrificing: During the Covid crisis, lockdowns didn't stop the virus, and that failure damaged many people's trust in mitigation efforts. The fault doesn't lie entirely with governments and public health departments (the virus evolved over time and eventually became too contagious to contain), but they do bear some blame for not acting more aggressively in January and February 2020, when the virus was less contagious and could still have been stopped.
Scientific and technical breakthroughs are our only hope: This has often been the proposed or actual solution to large-scale crises. In the Y2K crisis, it was the tedious correction of old computer code. In the Ebola outbreak, it was a new, never-before-tried vaccine. Similarly, with climate change, programs to develop ways to transport solar energy could help, if implemented at scale and with vigorous commitment and funding. (Producing the solar energy would be fairly easy; the challenges are reducing the toxicity of solar panel and battery manufacturing, increasing storage capacity, and figuring out transmission or transportation over long distances.)
People need to get paid (very!) well for their work on the endeavor: The COBOL programmers who came out of retirement to mitigate the Y2K bug did so because they were paid a lot of money! It's true that, with Ebola, we got lucky. But relying on luck is a poor strategy. Instead, working on the big problems facing society should pay better than workers' BATNA (best alternative to a negotiated agreement, in management-speak). Jobs focused on mitigating climate change should be highly paid, even subsidized by government through tax incentives for companies and individuals, so that compensation can compete with what Silicon Valley pays its software developers and engineers. Bright, innovative people need to choose this work and make excellent livings from it for the effort to be sustainable and eventually successful.
I’m catching some undertones of optimism here.
Risk management, on some levels, requires a pessimistic valence: you spend your days wondering what could go wrong. But it requires out-of-the-box optimism even more, because you spend the rest of your days exploring how to mitigate potential problems and pushing to change structures and strategies to make that happen. Lastly, it requires humility and perseverance, because while failure is obvious, success is frequently invisible or subtle. Disasters are often averted quietly.
And then the next day begins.
-<>-<>-<>-
Extra, Extra!
Three links from the depths of my bookmark archives; think of these as tangential extras for curious readers:
1. En-ROADS Climate Simulator - from MIT and Climate Interactive - play with the simulator and see for yourself the impact of changes in energy use on future global warming. We should have acted earlier, and we still need to act now.
2. USA has the world’s most extreme weather - by Doyle Rice in USA Today - the US has about 80% of the world’s tornadoes!
3. The panic attack of the power brokers - by Andrew Rice in Curbed - commercial real estate's dilemmas. The Financial District adjusted after September 11, 2001. Can Midtown adjust in 2022?