Playing the risk game (1 of 2)

June 21, 2010

NOTE: Images in this archived article have been removed.

BP’s oil disaster is partly the result of approaches to risk which lead us to believe we know more than we do.

It’s hard to know where to start with the BP oil disaster. Commentary has been gushing out almost as quickly as oil. We know the scale of the pollution, and have read ecologists who say the conditions are unlike anything seen on earth, certainly in the Anthropocene period. Dark humour is one response; angry satire is another (The Onion: ‘Massive Flow Of Bullshit Continues To Gush From BP Headquarters’).

We have a good idea that BP’s conduct over the drilling was somewhere between careless and reckless (and that sooner or later a court is likely to decide which), and that regulatory agencies were compliant or ineffective. One area that seems to deserve more thought – especially from a futures perspective – is the way in which essentially man-made disasters such as this are to a significant extent produced by a limited set of ideas about risk: both the way it gets assessed and the way it is managed.

In an interview this month, BP’s chief executive Tony Hayward called it a “low-probability, high-impact accident”. Elsewhere he’s quoted as saying it’s a “one in a million” chance. For the moment we’ll park the word “accident”, which implies that there’s no culpability anywhere along the line, and the finger-in-the-air “odds”, which are themselves at odds with a surprisingly long list of deepwater drilling near misses in US waters. But language like “low probability, high impact” would put the event in the territory of what (at least historically) futurists would have regarded as “wild cards” – events which are quite unexpected but have huge consequences.

We know less than we think about risk

This is problematic because it implies more knowledge about the nature of risk than we usually enjoy. The diagram below is taken from a presentation by Gary Kass of Natural England to the UK Horizon Scanning Centre’s ‘FAN Club’.

The axes are straightforward enough: the vertical axis is ‘knowledge of outcome’, running from ‘low’ at the bottom to ‘high’ at the top, while the horizontal axis is ‘knowledge of probability’, running from ‘low’ at the left to ‘high’ at the right.

The whole edifice of the risk management business is squashed into one quadrant in the top-right corner (which leaves an awful lot more that we know little or nothing about). Where we know about probability but not outcome, we move into the realm of ambiguity (and conventional risk calculations don’t work). In the top left, where we have some understanding of possible outcomes but limited knowledge of probabilities, we are in the realm of uncertainty. And in the bottom left we know nothing, either of outcomes or probabilities: this is the realm of ignorance.
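Since the diagram itself has been removed from this archive, a minimal sketch may help make the geometry concrete. The Python below is my own illustration, not anything from Kass’s presentation: it reduces each axis to a simple low/high flag (in reality each is a continuum) and returns the quadrant names used above.

    # A sketch of the two-axis model described above, with each axis
    # reduced to a low/high flag. Quadrant names follow the text.
    def classify(knows_probability: bool, knows_outcome: bool) -> str:
        """Map knowledge of probability (horizontal axis) and knowledge
        of outcome (vertical axis) to one of the four quadrants."""
        if knows_probability and knows_outcome:
            return "risk"         # top right: home of conventional risk management
        if knows_probability:
            return "ambiguity"    # bottom right: probabilities without outcomes
        if knows_outcome:
            return "uncertainty"  # top left: outcomes without probabilities
        return "ignorance"        # bottom left: neither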

But because “risk management” has become such a dominant idea and set of practices in the late 20th century, for reasons which I’m not going to try to disentangle here, issues which shouldn’t be in the domain of the risk managers are squeezed into it by organisations. Hayward would say that BP’s oil disaster is a top-right quadrant event, because he knows both probability (low) and impact (high). In fact, the first-order outcomes (oil in the sea threatening ecosystems, technical challenges with managing the rig failure) are at best in the area of uncertainty. The potential second-order outcomes (economic destruction of other businesses dependent on the Gulf, the possibility of ‘black rain’ in the hurricane season) seem to have been in the bottom-left quadrant, where ignorance resides. Or to put it another way, the relatively limited analytical tools of the risk management business are extended to areas where they are all but useless.
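Using the same illustrative sketch, the disagreement can be stated compactly: Hayward’s framing claims knowledge on both axes, while on the reading above the first-order and second-order outcomes sit in quite different quadrants.

    # Hayward's framing: both probability ("low") and impact ("high")
    # are claimed to be known, landing the event in the risk quadrant.
    print(classify(knows_probability=True, knows_outcome=True))    # risk

    # First-order outcomes as described above: some grasp of outcomes,
    # little grasp of probabilities.
    print(classify(knows_probability=False, knows_outcome=True))   # uncertainty

    # Second-order outcomes: neither probabilities nor outcomes understood.
    print(classify(knows_probability=False, knows_outcome=False))  # ignorance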

Knowing and not knowing

So we end up using the wrong tools for the task at hand. As the amount we can reasonably know about a future question diminishes, along with our possible understanding of the outcomes, we move into the realm of ontology and epistemology, of being and knowing. And Gary Kass’s model maps pretty well onto the model of what we know and don’t know developed by Sohail Inayatullah in his book Questioning the Future.


The ‘realm of risk’, above, corresponds to ‘What we know we know’, while the ‘realm of ignorance’ corresponds to ‘What we don’t know we don’t know’. (Pause for a Donald Rumsfeld moment: although he was mocked for it, it wasn’t a foolish thing to say.)

Of course, organisations tend not to have a Director of Epistemology on the Board. Instead, current tools and dominant models of the world are deployed even as some start to realise that they are no longer fit for purpose. This is because the traditional model of risk management – over-professionalised though it is – covers only one quadrant of these four.

Black swans and blind spots

And this inevitably takes us to the realm of the ‘black swan’, Nassim Nicholas Taleb’s construction for shocks and surprises which appear as if from nowhere, first described in his book Fooled by Randomness. I’ve criticised his concept before. Most so-called ‘black swans’, such as the perennially cited 9/11 attacks, turn out to have form; similarly, the banking collapse of 2007-08 was well trailed, but not by bankers.

In short, in futures work there are inevitably trails of clues anticipating the discovery of a black swan, and a whole set of methods (such as emerging issues analysis) to help interpret the clues. ‘Black swans’ are surprises only because people have been looking in the wrong direction, or not looking at all.

In its latest incarnation, the idea of the black swan now seems to be about ‘low-probability, high-impact events’, which seems unhelpful; we have methods for thinking about these already, even if they are unreliable. Instead, the value of the black swan should be that it focusses our attention on our blind spots, on the way in which our assumptions about the world obscure from us the parts of it which don’t fit with our worldview – or with our self-interest. There are always more things in heaven and earth than are dreamt of in any of our philosophies.

Instead of thinking about “risk” – which almost always involves thinking about the familiar, and in familiar ways – we need to be thinking about corporate ignorance and the things that we don’t know that we don’t know.

This is the first of two posts on this subject. The second will appear here shortly. The picture at the top of the post, of the Deepwater Horizon rig, is from the Gulf of Mexico energy news service OCS BBS, and is used with thanks.

Andrew Curry

The Next Wave is my personal blog. I use it from time to time to write about drivers of change, trends, emerging issues, and other futures and scenarios topics. I work for the School of International Futures in London. (Its blog is here). I started as a financial journalist for BBC Radio 4’s Financial World Tonight, before moving to Channel 4 News during the 1980s. I still maintain an interest in digital media and in the notion of the creative economy.

Tags: Consumption & Demand, Deepwater Oil, Energy Policy, Fossil Fuels, Industry, Media & Communications, Oil, Technology