Our universe is only about 5% ordinary matter (the matter we can observe). It’s about 27% dark matter, which we can’t directly observe. The rest—68%—is dark energy, which is even more mysterious.
Today seems like a great day to talk about the “dark matter” of risk: the risks you don’t see in routine reporting and in normal times, the ones lurking beneath verdant green risk dashboards or between well-defined categories, waiting for the right confluence of circumstances to pop out like scary Halloween jack-in-the-boxes.
Can we surface these risks?
Sometimes. Without omniscience, there will always be things you’ll never see coming, since subtle risk can manifest unpredictably, like a wildfire sparking to life due to a convergence of conditions. But here’s one possible strategy to increase the chances of identifying non-obvious or subtle risks.
Step 1: Read what’s available to you. That might be annual and quarterly reports if you’re dealing with a public company. It might be conference talks, academic papers, blogs, and media interviews if you’re dealing with a private company. In the movie The Big Short, it was reams of mortgage-backed securities: only a few people bothered to look under the hood at the individual loans. If you’re internal to a company, it might be risk dashboards and PowerPoint presentations.
Step 2: Ask yourself if the information you read matches observed reality. Sometimes things sound good on paper, but the implementation is rocky, like a startup pitch deck for a breakthrough that’s really just a clunky app. Even if the data indicates potential problems, it can be worthwhile to do a reality check, like one team of bankers did in The Big Short when they visited neighborhoods and spoke with homeowners and realtors before making a final decision about what to do.
Step 3: Given all the data that’s reasonably available to you, ask yourself what questions you have that aren’t answered by that data. You don’t need to find answers to these questions immediately or at all, but it’s worthwhile to know what questions you have, so you can decide how comfortable you are with the unknowns. Bernie Madoff’s data looked great—a little too great, which should have raised questions about the unknowns (for example, how is it possible to achieve this consistency of returns in both rising and falling markets, given the stated portfolio?).
Step 4: If you’re not comfortable with the unknowns you’ve identified, increase resilience. If you’re internal to a company’s risk management team, you might request data and propose a new metric to surface previously hidden risk for remediation. If you’re an investor, you might reduce, hedge, or exit a position. If you’re a consumer, you might switch to a product from a manufacturer you trust more. Whether you’re conscious of it or not, whatever your role, you’re constantly assessing known and unknown risks and adjusting your own stance to match your risk tolerance and risk appetite (aka your comfort level). This is a skill you can practice and hone.
So, only the paranoid will survive?
Maybe not even them. For starters, almost every environment features information asymmetry: at any given moment, someone else almost always has more information than you. Also, no matter what level of data access you have, no matter how observant and cautious you are, eventually you’ll be surprised by something.
But by increasing resilience and staying flexible and open-minded, rather than taking popular narratives at face value, you’ll be better positioned and better prepared to respond when hidden risks manifest. And, when dealing with the dark matter of risk, that’s a reasonable outcome.
Tangential extras for curious readers:
1. “Revealing Hidden Risks: Tools for Enterprise Risk Management,” by Jeanne Fallon-Carine and Gretchen N. Hancock - a risk perspective published in The Synergist, the journal of the American Industrial Hygiene Association.
2. “Hidden risks and biases,” by Rikard Lundgren and Linus Nilsson in The Hedge Fund Journal - finance-nerdy but a good read.
I liked your analogy to matter, dark matter, etcetera. The topic reminds me of Secretary Rumsfeld during the Iraq War trying to characterize the fog of war for reporters through the use of knowns and unknowns. For me, the big takeaway from The Big Short and a number of Michael Lewis’ books is that human beings are not good at assessing risk. This is largely due to the fundamental nature of how our minds work: training can help, but it’s just not how we’re built. This seems to be the lesson of every asset bubble. Outside of finance, it has emerged only recently, as humankind now pursues making things of inordinate complexity. Whenever that is done, it becomes natural to simplify (dumb down) what can go wrong into manageable bites. I enjoy your writing.