“Every captain should go down with every ship — every captain, every ship.”
In his bestselling book The Black Swan, Nassim Nicholas Taleb examined how improbable events with massive consequences — Black Swans — shape our world, and why most of us never see them coming. Fortune’s Brian O’Keefe asked the distinguished professor of risk engineering at NYU-Poly to help us derive some lessons from the accident at Fukushima.
In 2003 the Japanese Nuclear Commission said that a fatality due to radiation exposure from an accident at one of its facilities should happen less than once per million years. That was the standard by which they were managing the reactors. Even if we don’t have casualties yet from this accident, we know the nightmare scenario almost happened eight years into the million years.
This is what I call the criminal stupidity of statistical science. These models can tell you something about normal events, but they cannot deal with unexpected, high-impact events. Some guy probably measured the risk according to a formula and said, “Well, it meets the one-in-a-million standard.” But we are incapable scientifically of measuring the risk of rare events. We tend to underestimate both the probabilities and the damage.
Because the world has become so connected, a crisis today quickly becomes global. If the nuclear situation in Japan continues, you’re going to see a major disturbance of supply chains. Our connected world appears to be more efficient. We have economies of scale, and goods seem to cost less. But when there is a disturbance, the setback is much harder to handle. Not only are we building riskier systems, but the consequences of failure are also much larger.
Governments need to present the risks we face more accurately. Risks can be framed in a certain way. When someone tells you something could happen once every 1 million years, you’re going to take that chance, thinking the risk is periodic and you are safe in the short term. But really it’s like playing a game in a casino. The risk of the event doesn’t change from Monday to Tuesday.
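The casino analogy can be made concrete: if a "once every 1 million years" event is modeled as a small, constant probability each year, that probability is identical every year, and the only thing that grows is the cumulative chance of seeing the event at least once over a horizon. A minimal sketch of that arithmetic (the reactor counts and lifetimes below are illustrative assumptions, not figures from the article):

```python
# Illustrative only: a constant per-year hazard, as in the casino analogy.
p_per_year = 1 / 1_000_000  # the advertised "once per million years" rate

def prob_at_least_once(p, years):
    """Chance of the event occurring at least once in `years`
    independent years, each with per-year probability `p`."""
    return 1 - (1 - p) ** years

# The risk is memoryless: year 8 is exactly as dangerous as year 1.
# What compounds is exposure. Hypothetical numbers for illustration:
one_reactor_lifetime = prob_at_least_once(p_per_year, 40)        # one plant, 40 years
fleet_exposure = prob_at_least_once(p_per_year, 40 * 400)        # ~400 plants worldwide

print(one_reactor_lifetime)
print(fleet_exposure)
```

The point the sketch makes is Taleb's: framing the risk as "once per million years" invites the listener to feel safe in the short term, but nothing in the model makes any particular year safer than another, and the stated rate itself is exactly the kind of rare-event estimate he argues we cannot measure.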
Two years ago I published an editorial outlining 10 steps for a robust Black Swan society. One of the rules is to eliminate what I call the “agency problem.” That’s when someone else is making money and you bear the deferred risk, and it’s exactly what we had with bankers in the financial crisis. They make the bonuses, and we pay the cost when they blow up. Every captain should go down with every ship — every captain, every ship. Follow that rule, and we’ll live in a much better, safer world. And if a tenured academic makes a miscomputation, he should also go down with it. The government supervisors should too. Everyone involved in the chain should be responsible — not just the operator of the plant.