Management of BP called the Deepwater Horizon tragedy a “Black Swan”. Investigations into the incident assign root causes to failure of management systems. This knol concludes that the event was indeed a Black Swan, but that such a conclusion does not obviate the need for improving process safety management systems.
Taleb defines a Black Swan as an event with the following three attributes:
- The event is a surprise.
- The event has a major impact.
- After the fact, the event is rationalized by hindsight, as if it could have been expected.
With regard to Deepwater Horizon, of the first two elements listed above, there can be little discussion; the event was a major surprise, and its impact (loss of life, environmental, economic) was huge and will be felt for years.
The third element – rationalization by hindsight – has also occurred (and will continue to do so), as it does following virtually any major incident. For example, the report of the President’s commission on this event contains the following quotations:
The most significant failure at Macondo—and the clear root cause of the blowout—was a failure of industry management.
Better management of decision making processes within BP and other companies, better communication within and between BP and its contractors, and effective training of key engineering and rig personnel would have prevented the Macondo incident.
BP, Transocean, and Halliburton failed to communicate adequately.
The report is saying, in effect, that risk is under the control of management, and that if the management systems are strong enough then events such as DWH will not occur. Consequently, it can be inferred, Black Swan events are avoidable.
Relevance of the Topic
The above discussion as to whether catastrophic events such as DWH are Black Swans may seem merely academic, but it has deeper implications. Attributing the causes of events such as Deepwater Horizon to “failure of industry management” creates a tendency to believe that, had those management systems been better, the events would not have happened.
Now it is certainly true that better management systems will reduce the likelihood of such events repeating themselves, and will also reduce their consequences should the worst happen. Indeed, the implementation and improvement of management systems is the theme of most of the Sutton Technical Books web site. But this attitude can create a feeling that Black Swans can be avoided altogether – an idea that is in direct contradiction to Taleb’s thesis.
Preparing for Black Swans
Given that a Black Swan event is, by definition, impossible to predict, it would seem that managers in the process industries cannot prepare for such events. However, there are actions that can be taken. These include the following:
Invest in Emergency Response Systems
One reason that the Deepwater Horizon event was so serious was that the industry did not have plans or equipment in place to handle such an event; it was caught flat-footed. If it is accepted that catastrophic events can occur “out of the blue” then it makes sense to invest in preparing for them.
Be Wary of Risk Predictions
The inadequacy of forecasting, and the false confidence that it creates, is a major theme of Taleb’s book. His ideas apply directly to the process industries, where risk management focuses on the prediction of hazards, consequences, likelihoods and overall risk.
For example, a HAZOP team cannot readily analyze multiple-contingency events, because the method examines one deviation at a time and considering several simultaneous deviations creates confusion as to “where are we now?”. Yet virtually all major events (including Deepwater Horizon) are caused by many items going wrong at once.
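One way to see why missing simultaneous failures matters is the standard beta-factor common-cause model (this model and all of the numbers below are illustrative assumptions, not figures from any Deepwater Horizon analysis). Treating redundant safeguards as fully independent can understate the joint failure probability by orders of magnitude:

```python
# Beta-factor common-cause model: a fraction beta of each safeguard's
# failure probability is attributed to a single shared cause that
# defeats all of them at once. All numbers here are invented examples.

p = 1e-3    # assumed failure probability of one safeguard on demand
n = 3       # three nominally independent safeguards
beta = 0.1  # assumed fraction of failures due to a common cause

# Naive view: the event needs all three to fail independently.
p_independent = p ** n

# Beta-factor view: independent part plus the shared-cause part.
p_common_cause = ((1 - beta) * p) ** n + beta * p

print(f"independence assumed: {p_independent:.2e}")
print(f"beta-factor model:    {p_common_cause:.2e}")
```

With these assumed figures the common-cause term dominates: the naive estimate is around 1e-9 while the beta-factor estimate is around 1e-4, a difference of five orders of magnitude.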
Another example is to do with Fault Tree Analysis. Even the most expert analysts will miss important common-cause events, which is why catastrophic events seem to happen more frequently “than they should”. (This is why inserting an “All Other Events” basic event under the tree’s top OR gate is so useful – it keeps everyone humble.)
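A toy calculation shows the effect of that humility term (the gate logic is the standard OR-gate formula for independent events; the probabilities are invented for illustration):

```python
# Toy fault tree: the top event occurs if ANY contributing cause
# occurs (an OR gate). All probabilities are illustrative assumptions.

def or_gate(probs):
    """P(at least one input event occurs), assuming independence."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

identified = [1e-4, 5e-5, 2e-5]   # basic events the team analyzed

# Top-event probability using only the identified causes.
p_top_naive = or_gate(identified)

# Add a humble "All Other Events" basic event for unmodelled causes.
p_other = 1e-4                    # assumed allowance for what was missed
p_top_humble = or_gate(identified + [p_other])

print(f"identified causes only:   {p_top_naive:.2e}")
print(f"with 'All Other Events':  {p_top_humble:.2e}")
```

The extra basic event raises the computed top-event probability, reminding the team that the tree is a model of what they thought of, not of everything that can happen.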
Resist the Economic Drivers
Two important economic drivers in the process industries are: (1) make equipment and facilities large; and (2) move projects along quickly.
The first of these drivers – making equipment and facilities large – reflects the economies of scale that apply to all industries. The capacity of a vessel increases with the cube of its radius, but its cost (which depends largely on the surface area) increases only with the square of the radius, so the bigger an item is, the more economical it is. Other economies of scale include the reduced cost of labor per kilogram of product made.
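The cube-versus-square scaling can be shown with a short calculation (a spherical vessel is used for simplicity, and the radii and fabrication cost are arbitrary assumptions):

```python
# Economies of scale for a spherical vessel:
# capacity ~ r**3, fabrication cost ~ surface area ~ r**2,
# so cost per unit of capacity falls as 1/r. Figures are illustrative.
import math

def cost_per_capacity(radius_m, cost_per_m2=1000.0):
    area = 4.0 * math.pi * radius_m ** 2            # shell surface area
    volume = (4.0 / 3.0) * math.pi * radius_m ** 3  # capacity
    return (area * cost_per_m2) / volume            # cost per m^3

small = cost_per_capacity(2.0)
large = cost_per_capacity(10.0)
print(f"2 m vessel:  {small:.0f} per m^3")
print(f"10 m vessel: {large:.0f} per m^3")
```

For a sphere the ratio simplifies to 3 × (cost per m²) / radius, so the 10 m vessel costs five times less per cubic meter of capacity than the 2 m vessel.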
All these savings are well and good, unless and until there is a catastrophic accident – and then the losses wipe out all the money that was saved (along with non-economic losses to do with safety and the environment).
The second economic driver is the relentless pressure on management to move projects along quickly (this was a factor in Deepwater Horizon). But there is much to be said for developing technology in a paced manner, and for implementing projects more slowly so that there is time to check that what has been done to date is safe and operable.
Think the Unthinkable
In order to stop events such as DWH from recurring, management must recognize that risk analysis is a necessary but not sufficient part of the whole risk management process. Everyone needs to think creatively, and they must “think the unthinkable”.
The analogy with the scientific method is relevant.
The . . . scientific method is a way of testing. It is not a way of coming up with ideas, it is a way of testing them. Intuition generates the ideas. Once you have finished using the intuition to come up with some great ideas, then you put the intuition on a shelf and you go and see how it actually works. Because if it does not work, I do not care how marvelous it sounds, it is still a waste of your time. So you need to have a kind of twofold approach, where on the one hand, you write with your intuition running, come up with all kinds of ideas, and then test them to see what kind of response you get. Does it work? Does it flop? You will know.
If the phrase “scientific method” is replaced with “risk analysis”, the approach described above by Greer can be used to help identify and understand future Deepwater Horizon events.
Management of BP called the Deepwater Horizon tragedy a “Black Swan”. Yet the investigations into the incident assign root causes to failure of management systems. The above analysis concludes that the event was indeed a Black Swan because all three elements listed at the start of this page are satisfied.
This conclusion is not an excuse for fatalism, nor for failing to take the appropriate risk management measures. (“It was a Black Swan, so you can’t blame us”.) However, it does mean that, in order to prevent such incidents from recurring, fresh ways of thinking are needed. In particular, the well-established and successful risk management techniques could be allied with creative and imaginative approaches, maybe using insights from non-technical disciplines.