"The chances of an airplane having a bomb on it are very small," he reasons, "and certainly the chances of having two are almost none!"
It's interesting how we view risk. I mentioned in my last blog that perception of risk and the actual probability of an event happening are often (usually) not the same thing.
There is a phenomenon called the Monte Carlo Fallacy, also known as the Gambler's Fallacy. It arises around events that are truly random, or close to it, like the lottery, or tossing a coin and predicting whether it will come up heads. People who play such games often believe that every time they play and lose, each loss brings them one step closer to winning. Let's look first at a gambling example and then widen it out to more everyday events.
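The coin-toss version of the fallacy is easy to check for yourself. Here is a minimal sketch in Python (my own illustration, not part of the original post; the seed and number of tosses are arbitrary): it looks at every toss that immediately follows a run of five heads and shows the next toss is still roughly 50/50.

```python
import random

random.seed(7)          # arbitrary seed so the run is repeatable

next_after_run = []     # outcomes of tosses that follow 5 heads in a row
run = 0                 # length of the current run of heads

for _ in range(500_000):
    heads = random.random() < 0.5
    if run >= 5:                    # the previous five tosses were all heads
        next_after_run.append(heads)
    run = run + 1 if heads else 0   # extend or reset the run

rate = sum(next_after_run) / len(next_after_run)
print(f"P(heads | five heads just occurred) is approx {rate:.3f}")
```

The rate comes out close to 0.5: a streak of heads, however long, tells you nothing about the next toss.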
Take the lottery - any national lottery where you pick a series of numbers and win if your numbers come up. In the UK the lotto works out at about 14 million to one against winning the big prize. What happens is that people keep playing the same (lucky) numbers every week in the belief that every time their numbers don't win they are one step closer to winning - next week, maybe. The reality, of course, is that every week each selection of numbers still has a 1 in 14 million chance of winning, and that stays the same no matter how long you play. There is no reduction in this chance whatsoever: the numbers that won last week have exactly the same chance of coming up as any others. It makes no difference which numbers you play; you are still unlikely to win.
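You can both compute the odds and test the "one step closer" belief directly. The sketch below (again my own illustration, assuming the UK 6-from-49 format) first derives the roughly 14-million-to-one figure, then simulates a toy lottery small enough that wins actually happen, comparing the win rate after a long losing streak with the overall win rate.

```python
import math
import random

# The "14 million to one" figure: ways to choose 6 numbers from 49.
print("UK 6/49 odds: 1 in", math.comb(49, 6))   # 1 in 13,983,816

# Toy lottery (pick 2 from 6, odds 1 in 15) so a short simulation sees wins.
random.seed(42)                                  # arbitrary seed
lucky = {1, 4}                                   # our fixed "lucky" numbers
wins_overall = plays_overall = 0
wins_after_streak = plays_after_streak = 0
streak = 0                                       # consecutive losses so far

for _ in range(200_000):
    draw = set(random.sample(range(1, 7), 2))
    win = (draw == lucky)
    plays_overall += 1
    wins_overall += win
    if streak >= 10:                 # "lost 10 in a row - surely due a win?"
        plays_after_streak += 1
        wins_after_streak += win
    streak = 0 if win else streak + 1

print("Overall win rate:        ", wins_overall / plays_overall)
print("Win rate after 10 losses:", wins_after_streak / plays_after_streak)
```

Both rates hover around 1/15: the losing streak changes nothing, because each draw is independent of every draw before it.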
The same applies to other large and random(ish) events, like air crashes. The idea that because a plane has crashed already this week, and on average only a few crash a year, I must be safe on this flight is nonsense. OK, that's not much comfort if you are a nervous flyer, but it is realistic at least. There are better indicators of airworthiness, like maintenance schedules, but even they don't fully account for the random chance of a series of hitherto unknown issues coming together at some particular time. The stock market is another example. Past performance is not a guide to future performance - how many times have you heard that? It is true, yet people still look to past trends to inform future decisions in situations that are random, or as close to random as makes no difference. It is sort of hard-wired into us.
This is the other side of risk aversion. As mentioned yesterday we are more likely to be risk averse if there is a perception of potential loss as opposed to a perception of potential gain.
Because things don't really happen in threes (sorry!) and because what we believe is not always true, an appreciation of the psychology behind risk behaviour and thinking starts to help, especially during uncertain events like organisational change, which is often a really good instance of ambiguity plastered over and made to look rational. We never truly know what the effects of re-engineering an organisation will be, even if we believe we do. We can make a good guess, but it is not guaranteed, and it is a lot more ambiguous than most OD professionals would like to admit.
In the next blog I will explore what is called the 'representativeness heuristic', which sheds some light on why people fall for the Monte Carlo Fallacy, and why risk-averse and risk-taking behaviour is an important issue, especially in times of uncertainty.