Massive Risk Management
On Monday, Governor Schwarzenegger announced that he is withdrawing his support for a planned offshore oil project in California state waters. He was very clear: the decision was made specifically because of the Gulf oil spill.
“I think that we all go through the endless amount of studies and research and everything, and before you make a decision like that, you are convinced that this will be safe,” the governor added. “But then again, you know, you see that, you turn on television and see this enormous disaster and you say to yourself, why would we want to take that risk?”
We have a hard time planning around risks that have low probability but potentially massive impact. Most risk assessment is done intuitively, and our intuition gets fickle around long-tail events. Our gut instincts differ from person to person, and perhaps even within ourselves. Somehow I can never be bothered to wear a helmet when I get on a bicycle, but when riding a motorcycle in states without a helmet requirement, the idea of taking mine off strikes me as absolutely insane.
Formal cost-benefit analyses can be used to mathematize planning around uncertain outcomes, and they often are. But CBA can lead to especially extreme cases of garbage-in/garbage-out, and usually does. What number do you assign to the “cost” of a species loss, for example? And how would an equation help if you didn’t know the probability of that loss anyway? Laplace’s principle of insufficient reason can be used to hold these shaky formalizations together: if you have no grounds for favoring one outcome over another, assign them equal probability, so a yes-or-no event gets a 50/50 chance. Which makes intuitive sense, I guess, but here we are at intuition again.
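To make the garbage-in/garbage-out worry concrete, here’s a minimal sketch of what such a calculation looks like. Every number in it is an invented placeholder (the spill probability, the revenue, the spill cost), which is precisely the problem:

```python
# A toy expected-value cost-benefit comparison for a hypothetical drilling
# project. Every number below is an invented placeholder -- which is the
# point: the conclusion is only as good as these guesses.

def expected_value(outcomes):
    """Sum of probability-weighted payoffs, in dollars."""
    return sum(p * payoff for p, payoff in outcomes)

# Laplace's principle of insufficient reason: no information favoring either
# outcome, so the hypothetical spill gets a 50/50 chance.
P_SPILL = 0.5
REVENUE = 2_000_000_000      # projected project revenue (made up)
SPILL_COST = 50_000_000_000  # cleanup, fisheries, a species...? (made up)

drill = expected_value([
    (1 - P_SPILL, REVENUE),
    (P_SPILL, REVENUE - SPILL_COST),
])
no_drill = 0.0  # baseline: forgo the revenue, avoid the risk

print(f"E[drill]    = ${drill:,.0f}")     # about $-23 billion
print(f"E[no drill] = ${no_drill:,.0f}")
# Set P_SPILL = 0.001 instead and the recommendation flips.
# Garbage in, garbage out.
```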
Intuition is sensitive to recent circumstances. And so, because of the timing of the “pictures on TV,” California won’t have offshore drilling. I’m sympathetic to the governor; I’m sure he was shown credible evidence that safety standards on offshore rigs have improved a great deal. But how much safety is enough? It depends on what’s on TV at the moment.
We collectively face a lot of these low-probability/high-impact decisions, and I think they’re some of the most important choices societies make (whatever that means). For example, the question “how should we deal with the threat of terrorism?” is premised, often invisibly, on the question “how much of a threat is terrorism?” Is the attempted Times Square bombing proof that Americans are living under threat? Or is it a reminder that American citizens are remarkably safe from home-front terrorism? When the potential consequences are so important, declaring something irrelevant because it’s out of the ordinary doesn’t seem right somehow. And yet, and yet.
Or how about that crazy climate change? Critics suggest that because we have uncertainty in the outcomes (which we absolutely do) we shouldn’t be pouring resources into combating an unknown. Which isn’t so crazy, if you consider the opportunity costs: the money and time and political capital we spend keeping carbon out of the atmosphere could go to plenty of other deserving projects. But my intuition tells me that the uncertainty associated with climate change is precisely the reason we should fear it. I worry that we’re going to learn too late the value of a predictable climate. Each specific climate-linked tragedy may be unlikely to the point of absolute unknowability, but somehow that collection of unknowable tragedies sounds like the worst thing in the world to me.
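There’s a way to make that intuition a little less mushy: if damages accelerate as the climate shifts, then a spread of possible outcomes is worse, on average, than its midpoint. A toy sketch, using a made-up quadratic damage function rather than anything from actual climate science:

```python
# Toy illustration of why uncertainty itself can be costly: if damages
# accelerate with warming, a spread of outcomes is worse than its midpoint.
# The quadratic damage function is a made-up stand-in, not climate science.

def damage(degrees_of_warming):
    return degrees_of_warming ** 2  # accelerating damages (assumed)

known = damage(3.0)                             # certain 3 degrees -> 9.0
spread = 0.5 * damage(1.0) + 0.5 * damage(5.0)  # 50/50 between 1 and 5 -> 13.0

print(known, spread)  # same average warming (3 degrees), more expected damage
```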
I have a hard time articulating that threat to myself or to others, but the precautionary principle speaks to it. According to Wikipedia, the principle states that
“if an action or policy has a suspected risk of causing harm to the public or to the environment, in the absence of scientific consensus that the action or policy is harmful, the burden of proof that it is not harmful falls on those who advocate taking the action.”
Environmental systems are weird. They often seem to be complex in the academic sense, behaving in aggregate in ways that can be resistant to perturbation one moment and suddenly highly sensitive to it the next. Formal complex-systems theory usually isn’t very good at predicting outcomes in environmental systems (although I think it’s fabulous at helping us understand why we can’t make those predictions). Ecologies are weird and unknowable, but they are also crucial to our lives, in ways both big and small. We will all die if the ecosystem services we rely on are thrown out of whack, but we will all be miserable and grumpy long before those services completely collapse. That combination of complexity and cruciality makes predictions around unlikely but potentially significant environmental dangers especially perplexing.
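That “resistant until suddenly sensitive” behavior is easier to show in a toy model than to describe. The sketch below uses a textbook fold-bifurcation equation (dx/dt = r + x - x^3), not a model of any real ecosystem: the state shrugs off a growing stress, and then, past a threshold, abruptly collapses into a different regime:

```python
# Toy "tipping point": the state x resists a growing stress r... until it
# doesn't. dx/dt = r + x - x**3 is a textbook fold bifurcation, not a model
# of any real ecosystem.

def settle(r, x=1.0, dt=0.01, steps=20_000):
    """Follow the dynamics until x settles near an equilibrium."""
    for _ in range(steps):
        x += dt * (r + x - x ** 3)
    return x

for r in [0.0, -0.2, -0.35, -0.38, -0.39, -0.4]:
    print(f"stress {r:+.2f} -> state {settle(r):+.3f}")
# The state drifts down gently, and then, past a threshold near r = -0.385,
# it jumps to the other branch (around -1.16): a small extra push, a sudden
# collapse.
```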
I’m not usually a small-c conservative; I tend to value experimentation and liberal politics. But because of the particularly fraught nature of environmental choices, I’m a big believer in that precautionary principle.