By Dave Huitema – It has been a couple of years since Nassim Nicholas Taleb published his book “The Black Swan: The Impact of the Highly Improbable”. The book is amusing reading, if only for the personality of the author, who is opinionated on just about anything. This includes the question of whether an expert is really an expert (in the sense of knowing more than a lay person – a political scientist does not, says Taleb), whether one should read newspapers (better not), risk management in casinos, and the value of yearly personnel evaluations (better not to have them). The key messages of the book, however, relate to highly improbable events with large consequences, and as such it is very relevant for water management, as I think was underlined by the recent Japanese tsunami and its impact on the Fukushima nuclear power plant.
Some of the key insights one can take from Taleb’s book relate to the way risks are frequently calculated. He launches a blistering attack on Gaussian statistics, which are essentially based on a bell curve. This curve assumes that most events (such as water level fluctuations) will vary close to an average, and also that deviations from the average become unlikely increasingly rapidly the larger they are.
Taleb shows how this line of thinking applies in what he calls “Mediocristan”, but not in what he calls “Extremistan”. Parameters that conform to Mediocristan logic include the height of human beings, where values can safely be assumed to vary closely around an average; larger deviations are very rare and beyond a certain limit impossible. Thus we can safely assume that the chance of encountering a person more than 3 metres tall is zero. However, Taleb also shows that other parameters actually come from Extremistan, where there is little reason to assume that extremely large or small values are so unlikely to occur. Extremistan logic applies to academic fame – because it is usually just a select number of experts who are quoted by everybody – but also to book sales and to economic crises. To give a concrete example: if the average book in the English language sells 200 copies, the bell curve would suggest – in the absence of other blockbusters – that the sale of 450 million copies of the Harry Potter series is so unlikely that one can ignore the possibility. Until it actually occurs, of course.
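The contrast between the two “countries” can be made concrete with a small calculation. This is my own sketch, not from Taleb’s book: the standard deviation of 100 for book sales and the Pareto exponent of 1.1 are assumptions chosen purely to make the comparison visible, not estimates of real publishing data.

```python
import math

# Average book sells 200 copies; how likely is a 450-million-copy blockbuster?
mean, sd = 200.0, 100.0      # sd is an assumed value, for illustration only
threshold = 450e6

# Mediocristan view: Gaussian tail, P(X > x) = 0.5 * erfc((x - mean) / (sd * sqrt(2)))
z = (threshold - mean) / (sd * math.sqrt(2))
p_gaussian = 0.5 * math.erfc(z)   # underflows to exactly 0.0 — "impossible"

# Extremistan view: power-law (Pareto) tail, P(X > x) = (x_min / x) ** alpha
x_min, alpha = 200.0, 1.1         # alpha is an assumed exponent
p_pareto = (x_min / threshold) ** alpha

print(f"Gaussian tail probability:  {p_gaussian}")
print(f"Power-law tail probability: {p_pareto:.2e}")
```

Under the bell curve the event is so many standard deviations out that its probability underflows to zero in floating point, while the fat-tailed distribution assigns it a tiny but entirely real probability (roughly one in ten million) – rare, but the kind of rare that actually happens.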
The key insight here is that predictions based on the bell curve will lead to an underestimation of the likelihood of extreme events if those events belong not to the order of Mediocristan but to that of Extremistan. Taleb applies this thinking to economics, where he shows how models underestimated the possibility of large bank failures that simply could not happen – until they did. I would not be surprised if Mediocristan logic also underpinned elements of the design of the Fukushima nuclear power plant. It was designed to withstand an earthquake of a certain magnitude and a tsunami of a certain height. I am certain that risk calculations were made to assess the likelihood of more extreme events, and that such calculations showed them to be unlikely (perhaps impossible) – until, of course, they happened.
The question, of course, is how one can tell the difference between parameters that belong to Mediocristan and those that belong to Extremistan. This might actually be quite difficult to determine, so perhaps the rule of thumb should be: if the consequences of an event are potentially enormous, it is safer to assume that we are not dealing with bell curve type risks.
Can we read Taleb’s advice, then, as a plea for the precautionary principle? It appears not to be the case. Taleb writes that in his personal investment portfolio he adopts an extremely conservative approach for 80% of his money, and an extremely aggressive risk-taking approach for the remaining 20%. For an individual this sounds like an appropriate solution, but one is curious how it translates to societal choices when it comes to energy production. Would it mean 80% power generation by conventional coal- or gas-fired plants, and 20% from untested technologies?
Another issue I found myself wondering about is the prospects for learning. In his fascinating account, Taleb shows how some of the worst black swans wipe out the evidence of their own existence. They can be so devastating that none of the victims survive, and no one is able to relate that the event took place (silent evidence) or how it came about. Subsequent generations will therefore possibly not consider the exact same event very likely. There is another aspect to this: we tend to listen to the survivors (if they exist) and attribute their success to certain smart choices they made. However, in many cases there is an element of luck in surviving a black swan event, and the stories of those who made the very same smart choices but did not survive will not be heard. The consequence is that we might actually be learning the wrong lessons from black swans, as the survivors were simply lucky and did nothing special.
As a consequence we should be much more focused on failure than on success, in line with the philosophy of science developed by Karl Popper. One of the best stories told in the book, for me, is the one about the casino that literally spent millions and millions of dollars keeping an eye on its visitors. Surely they occasionally caught cheaters. The real losses, however, occurred completely outside the view of the security system: an employee not filing certain reports to the tax authorities, the kidnapping of a family member of the casino’s owner. Can such black swans be avoided, and can we make sure we profit from their logical opposite, white swans – unexpected events that bring us windfall profits? Taleb suggests that this is not possible, but that black swans can be made grey by thinking critically and by not applying Gaussian statistics. What this could mean for water governance is still an open question to me, but a nice one.