Friday, February 29, 2008

Ambiguity in Organizations


Again, some interesting comments from Christine.

Here are the approximate distributions of the modes of leadership. As you can see, mode 4 leaders are in the minority. (Modes 0 and 5 are theoretical at the moment, as the research on these hasn't been completed - these are the latest research distributions, but only modes 1-4 are accurate populations.)

In terms of the ability to deal with ambiguity, roughly speaking, the closer to mode 4 you get the better things get. Like everything in life, though, every upside has a downside. The downside here is that the closer to mode 4 you get, the less methodical people are, and they really don't like stability too much. So at the moment an organisation needs change, these are the guys; however, the moment an organisation needs to settle down and have a period of stability, then modes one and two are the people to help. The approach we take is to give people, particularly leaders, the ability to operate in any mode depending on the situation and the outcome desired. This is what gives the leaders (and their organisations) their agility. Being agile usually means better decisions and more flexible thinking.

The point for me in developing (at least) tolerance of ambiguity in a wider population is that without it people's decision making is usually impoverished. By this I mean that if they are reacting emotionally, in a knee-jerk way, to uncertainty or risk, they are not usually making great decisions. Developing ambiguity acuity equips people to think more clearly, make better decisions and behave better, and it enhances problem solving, particularly, but not exclusively, in difficult and shifting situations - the very situations others spend most of their time avoiding or denying. Such people certainly outperform colleagues who don't have much emotional resilience in a wide range of leadership tasks. These are also the people who will take a risk and try new things.

As you increase the number of people in an organisation who can, at a minimum, cope with ambiguity, the more agile the organisation becomes, and the easier it finds it to navigate difficult times and find advantage where others are struggling. Such organisations also adapt to changing conditions more quickly and with a better fit. This is why we do the work we do in companies and organisations: it makes them successful right at the time others are having it tough. To come out of a tough time like a recession in great shape - being innovative and having found new markets or other advantages during the difficult times - is like having a springboard into a new world, while others are still just looking up at the board wondering how to get up there.

In short - deal with ambiguity better and you and your organization become more agile, competitive, and swift.

Thursday, February 28, 2008

Competency, proficiency and capability in ambiguity

I am currently teaching at Cardiff University, where an interesting conversation was sparked about the differences and similarities between competency, proficiency and capability. Considering the last two blogs about developing ambiguity competency, I thought this might prove an interesting discussion for the blog. It might get a little convoluted, but it is worth sticking with.

Capability: This is usually linked to the terms capacity and ability. Generally speaking, in this context, to say someone has the capability to deal positively with ambiguity usually means that the individual has the ability or skill, and the capacity, to deal well with ambiguity. However, it does not mean that the person in question will actually do so. They have the capability, but if they don't have the desire to use their skills, or contextual factors suggest that using their capability might not be ideal, then the capability is unlikely to be realised.

Competence: This normally suggests that an individual has the required skills and knowledge to do something, in this case handle ambiguity. Now there is a question as to whether a person with a competence in, say, dealing with ambiguity is actually a competent person! Just because someone has a competence in an area, does it necessarily mean that they are what we would generally recognise as being competent? Whilst they use the same word, one (having a competence) might not lead to the other (being seen as competent). Go figure. You could argue that it should, but does it in reality? This suggests that being competent takes more than having a series of competencies. One of the things that differentiates competencies from competence is the factor of context. A person would be described as competent in dealing with ambiguity or risk, for example, if they appear to deal well with these in a wide range of situations and contexts, especially when the going gets tough.
Another factor in being competent is agility or flexibility: the ability to change and develop the competence in the light of new situations and thinking. Competencies therefore have levels, leading up to a competence that can be shown in almost any situation, regardless of the degree of difficulty encountered.
So what about proficiency? This might well be what we would suggest competence is. To be proficient in something suggests an advanced level of competency; it suggests expertise.

So we can be capable and yet not actually use the capability, and we can have competencies and yet not be competent in the area in question. However, if we are proficient we must be competent, have the required competencies and have the capability. Simple really!

So what does this mean for ambiguity and risk?
Many have the capacity to deal with ambiguity well and make great decisions, but don't.
Some may have the capability to find the advantage in ambiguity but don't realise it.
A few may have competencies in the areas of ambiguity, like emotional resilience, critical thinking, creativity etc. and yet may not be competent in ambiguous situations.
Only a few (largely mode 4 individuals) are proficient when it comes to dealing well and finding the advantages in ambiguity and risk.


Wednesday, February 27, 2008

Ambiguity Competencies II

This is the second part of the last blog. I will continue to cover the issues that Christine raised about ambiguity competencies.
3. Leading others in times of ambiguity
There is a series of skills, thinking, attitudes and behaviours that are specifically required when people have to lead others in times of uncertainty. These include (but are not limited to) the following:

  1. Creating a compelling and real vision of a required future state for people to move into - there are specific methods and tools for this.
  2. Intrapersonal ability - being able to monitor your emotional state and manage it. This includes:
    1. Emotional resilience - the ability to move quickly out of one emotional state into another and change the cognitive frame being applied.
  3. Interpersonal ability - connecting with others, empathy, and being able to manage other people's emotional states.
  4. Agility - the ability to see change and move with it, and respond accordingly. This is a key competence that has a series of sub skills, thinking, attitudes and behaviours associated with it. This is a major differentiating attribute that mode 4 leaders have.
  5. Decision making in ambiguous situations where the outcomes are uncertain. Some very specific methods are available here.
  6. Problem solving, especially where data is missing, incomplete or contradictory. We are currently preparing a series of online workshops around this.
  7. Generative (Creative and innovative) practices.
  8. The ability to use diverse resources and thinking frames without prejudice.
to name but a few. The aim here is to have agile and flexible leadership that can see change as it happens, change its approach when necessary to suit the situation, and maintain progress, including making counter-intuitive moves.

4. Developing ambiguity tolerance / resilience in organisations
This requires a special strategic approach, one that incorporates multiple pathways and outcomes. Most strategies have one future and one pathway. Developing tolerance of ambiguity, the ability to profit or gain advantage from ambiguity, and increased organisational resilience takes more than just more plans. Just look at the pickle FEMA got into over the Hurricane Katrina disaster: their problems did not stem from a lack of plans or planning. Organisations that are agile and can adapt quickly require a different mindset and a different type of strategy.
Again there are different sets of competencies and capabilities required for this than are present in most frameworks.

One further word. Any such competency or capability framework needs to have a direction (and a level of ambiguity) that produces the emergent properties required, i.e. the ability to deal with ambiguity positively, find and capitalise on the advantages in every situation and develop agility, excitement, energy and resilience.

Need more? Contact me

Monday, February 25, 2008

Ambiguity Competencies

The comment Christine left on the last blog was really interesting. It raises some very important and practical issues about using ambiguity for development in organisations.

Oddly, when I ran a department at Cranfield University I used to research and teach competency development as one of my areas of interest. I have come across a couple of sets of competencies for ambiguity, and I would agree with you that just about all of them miss the point - or help others, like managers, to miss the point!

For me there are several important issues here, which fall under four broad headings:
  1. Creating ambiguity for advantage
  2. Dealing with ambiguous situations to gain the advantage
  3. Leading others in times of ambiguity
  4. Developing ambiguity tolerance / resilience in organisations
1. Creating ambiguity for advantage

The first is that creating ambiguity works in certain situations, mainly those where there is an advantage to be gained from doing so. This requires excellent decision-making capabilities - the ability to know exactly when to create ambiguity and when to create clarity, which are different but connected cognitive skill sets. There is then the question of how to create differing levels of ambiguity or clarity for the effect required.

When we are working in organisations we concentrate on developing 6 areas of capability that all contribute to the ability to use ambiguity well:
  1. Emotional Resilience
  2. Decision Making
  3. Problem Solving
  4. Critical Thinking
  5. Creative Thinking
  6. Development of Autonomy
2. Dealing with ambiguous situations to gain the advantage

The second is that the major skill in dealing with ambiguous situations is to find the advantage inherent in such circumstances, especially when just about everyone else is heading for the hills or a bunker somewhere nice and safe.
This requires a good level of emotional resilience, which is different from emotional intelligence (also required). One of the things we do know about ambiguous situations is that, with the exception of mode four individuals (see Modes of Leadership), they almost always bring about a change in people's emotional state. Heightened emotional states almost always reduce the effectiveness of cognitive operations (thinking). Therefore, when a person feels that things are ambiguous they will respond in one of a number of ways, ranging from total denial, to construction of a new reality, to attention being placed on less ambiguous items, to displacement behaviour, and so on. Competency frameworks therefore need to treat emotional resilience as a separate (but linked) skill set from emotional intelligence. Interestingly, this is where a lot of our work comes from: helping organisations develop the thinking and skills to profit from ambiguity, part of which is developing emotional resilience.

As a side note, we discovered that skills or competency development programmes have greater impact when the cognitive side of things is addressed. In other words, the thinking needs to develop with the skill, which is one of our USPs and is based on the idea of modes of thinking embedded in the Modes of Leadership model. The reason is that the system of logic we apply to any situation changes the way we see things and consequently how we behave or react, which is inextricably linked with our emotional responses. This is why, when we engage people in our Agile Leadership Programme (pdf), we develop all six skill sets at the:
  1. Behavioural,
  2. Cognitive
  3. Belief / attitudinal, and
  4. Emotional levels together.
This holistic approach accelerates the development process and means that graduates of the programme can deal with any situation that occurs, make good decisions, be creative and critical, and stand on their own two feet in any situation.

In terms of recruiting similar issues abound.

I will answer issues 3 & 4 later.

Sunday, February 24, 2008

Ambiguity Blogs

I have just done a Google search on the term Ambiguity Blog, with interesting results. The top blog is one called 'Revel in Ambiguity', subtitled 'glory in the gaps', which I got quite excited about, until I opened the blog. To be fair, it does indeed provide a lot of ambiguity - there isn't any! It is the blog of a newly married young woman who appears to be cooking her way to domestic bliss.
The next site, 'Making sense of it all - Meow', doesn't appear, as far as I can tell, to mention ambiguity or anything vaguely related to it. It contains the musings of a marketing guy who, by the looks of the LinkedIn and Facebook profiles on the blog, spends a fair amount of his life networking (well, at the very least making links with a pile of other people). Surely networking is more than a million internet links to other people's profiles. You are not even networking with the actual people. There is a guy who linked with my profile on LinkedIn when I had just started. When I looked at his profile he had over a million links! Networking? I think not - it's more like notworking.
Then comes a blog entitled Deliberate Ambiguity, which sounded really interesting. It is subtitled '...musings about philosophy, marketing and even the occasional taxidermy.' Oh oh - another marketing bod, this time in braces and a tie. This blog is marketing first, marketing second and marketing third. My definition of philosophy must be way out of date.
Then comes the Ambiguity Advantage blog - about ambiguity, oddly.

That's it!

From there on in (in Google) all the rest of the entries are individual blog posts that mention the word ambiguity in some way, usually as a synonym for being uncertain.
Maybe people like the idea of ambiguity without having too much of it about. It might also say something about the ambiguity people feel in their lives, and that the blogs are a way of disambiguating their lives. It's all a little vague, which isn't a bad place for it to be, maybe.

Saturday, February 23, 2008

The Ellsberg Paradox - Ambiguity Aversion

A great explanatory video of the Ellsberg Paradox has been posted over on the Curious Website. It describes Daniel Ellsberg's famous ambiguity aversion experiment. As I describe in 'The Ambiguity Advantage', most people shy away from ambiguous situations. The masters of ambiguity (mode four individuals), on the other hand, are very comfortable with and explore ambiguous situations - the very conditions (as the video shows) most others steer away from. In part, just being one of a small number of players in any (ambiguous) situation gives an advantage on its own. However, there are specific techniques and, more importantly, frames of thinking the 'masters of ambiguity' use. Mode four thinkers rarely fall victim to the decision making and problem solving biases we are currently exploring on the Ambiguity Advantage blog. How come? Stay tuned and all will be revealed.

What happens when authority meets ambiguity?

An interesting series of responses of authority (in the form of the police) to an ambiguous, non-crime situation. This is a great example of mode one behaviour. Enjoy!

Thursday, February 21, 2008

Action Bias in Decision Making & Problem Solving

The blogs have been a little sporadic in the last few weeks as I have been in the Middle East running workshops for a series of universities and agencies on how to develop critical and creative thinking, as well as higher levels of problem solving, decision making, greater levels of autonomy and leadership capabilities in students and employees.


Another factor that shapes when we decide to make a decision (or what it is that triggers us to make one), and that contributes to the decisions we make, is a phenomenon called action bias. Simply put, just about everyone, when faced with ambiguous situations, especially circumstances associated with risk, gets the feeling that they need to take some action, regardless of whether this is a good idea or not. This frequently contributes to misjudgements about when to act (usually too soon or in the wrong direction) and misperceptions of the nature of the problem facing them. People not only make decisions too soon; they could almost always have made a better decision if they had an awareness of the unconscious psychological drivers that push us to decide.

Simply put, action bias means that when faced with uncertainty or a problem, particularly an ambiguous problem, we prefer to do something - in fact we are happier doing anything, even if it is counterproductive - rather than doing nothing, even if doing nothing is the best course of action. Action bias was noticed by Bar-Eli et al. (2007) in a study of goalkeepers' behaviour in soccer games when trying to save a penalty. When they analysed where penalty takers place the ball, they found that just over a third of the time they shoot for the middle, and the remaining times, just under two thirds, they aim for either the left or the right corner. And yet, when faced with the decision of what to do, almost all goalkeepers prefer to leap either to the left or to the right rather than stand in the middle, where on average they are marginally more likely to save more goals. The thinking behind such a decision is that it looks and feels better to have missed the ball by diving (action) in the wrong direction than to suffer the ignominy of watching the ball sail past having never moved. Action bias is usually an emotional reaction based on the feeling that 'I have to do something', even if I don't know what to do.
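The arithmetic behind the goalkeeper finding can be sketched in a few lines. The kick distribution below loosely follows the proportions quoted above, but the save probability for a correctly guessed region is an invented assumption - treat this as an illustration of the logic, not the study's actual figures.

```python
# Illustrative expected-save calculation for the action-bias finding.
# kick_dist loosely follows the proportions quoted in the text;
# p_save_if_matched is a hypothetical number for illustration only.

kick_dist = {"left": 0.32, "middle": 0.36, "right": 0.32}  # where kicks go

# Assumed chance of saving a kick when the keeper picks the same
# region as the kick; a mismatched guess is assumed never to save.
p_save_if_matched = 0.60

def expected_saves(keeper_choice):
    """Expected save rate for a keeper who always picks one region."""
    return kick_dist[keeper_choice] * p_save_if_matched

for choice in kick_dist:
    print(choice, round(expected_saves(choice), 3))
```

Under these made-up numbers the middle is the best single choice (0.36 x 0.60 = 0.216 versus 0.192 for either corner), yet diving still feels better, which is exactly the point of the bias.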

The same often applies in many other situations. In a study of police officers dealing with minor disorder outside night clubs in the UK, for example, it was noticed that when some (a minority of more mature and often more experienced) officers were present at the scene, they were much more likely to be tolerant of minor disorder, hang back and not act - preferring instead to keep an 'eye on the situation' when they considered the behaviour to be 'horse play' and without consequence to other members of the public. When other, usually less experienced (the majority of) police officers witnessed such behaviour, they were much more likely to act, engaging with the 'offenders' at an early stage. The result was that where police officers didn't act there were fewer arrests, fewer injuries, and the situation usually defused itself without intervention. However, when officers did intervene early, the situation was far more likely to escalate and more people were likely to be sucked in. When the police took action, more of a crowd of onlookers usually developed, with the result that some of them got drawn into the situation. The officers who did act early almost all reported that they felt compelled to 'do something' and that 'sitting around doing nothing isn't an option'.

Action bias frequently draws us into 'doing something' when hanging back, observing and exploring the situation for a while is often the best action to take. As you can see, action bias can easily make situations worse and is the foundation of a lot of poor decision making in companies and organisations around the world. It is linked to both the illusion of control phenomenon and the regression fallacy, which were the subjects of the last two blogs.

It is also worth noting that action bias leads us to jump into developing solutions before we have fully articulated the problem (solutionizing) - a subject that has been the focus of previous blogs.

Also there are one or two places left on the March 4th workshop.

Bar-Eli, M., Azar, O. H., Ritov, I., Keidar-Levin, Y. and Schein, G. (2007) Action bias among elite soccer goalkeepers: The case of penalty kicks. Journal of Economic Psychology, 28(5), 606-621.

Sunday, February 17, 2008

Illusion of Control

Last week at the Medical Sciences Division (Oxford University), one of the subjects we explored follows on nicely from the last blog: illusions of control. In 1975 Langer published a paper in which she showed that many people tend to believe they can control and change events that are in fact beyond their control. Even during truly random events, like the lottery or rolling dice, people often believe that they have the skills and attributes to change or influence the outcome. Such a belief is not confined to individuals. Teams also fall foul of this decision making bias, and because others are involved in the bias, it usually removes all doubt in the entire group that they can influence events which, examined more objectively, are beyond the control of the individuals and teams concerned.

The question, as I frequently ask lecture and workshop participants, is: so what?

When you think about the decisions governments, boards and committees make, you don't need to go far to see the effect of the 'illusion of control' playing out - the belief that some policy or other action can do things like reduce crime, increase educational attainment or solve market-related issues. This does not mean that I am not a believer in action, only that many actions we take and assume have solved the problem have not in themselves been the solution. It has often been some other effect, like the regression fallacy, at work.

There are a couple of interesting things here worth mentioning:

The first is that the cognitive bias we develop called the 'illusion of control' is frequently a response to ambiguity. Disambiguating something beyond our control appears to help emotionally. OK, it doesn't lead to good decisions, but we feel a whole lot better. A nice example of this is the difference between being a passenger on a plane, with the varying degrees of unease felt, compared to the pilots, who have a sense of control. A smaller version of the effect can often be felt driving your own car versus being a passenger in someone else's. Yesterday I flew to Riyadh (where I am now) and was asleep when the plane hit a patch of really bad turbulence. I found myself sitting up and becoming alert, just in case. In case of what??? I found myself reasoning that we were 37,000 feet up, flying at 550 mph. If anything went wrong, what was I going to do about it? Sod all, really, apart from probably scream, and even then for what purpose? It just felt better to be alert and have the illusion of control, even though in reality I had zero control over the situation. I was just trying to disambiguate the situation and (this is an important point) feel better - the emotional connection again. Once I realised what I was doing I relaxed, gave myself up to the uncertainty of the situation, stopped disambiguating and fell asleep!

The second is something called action bias. More of which in the next blog. Oh, and we will cover the recency effect as well!

Tuesday, February 12, 2008

The Regression Fallacy and decision making / problem solving

Following on from the last article, the question was: why do we make decisions when we do?

The answer is that usually it is because we discern that there is a need for action. (well yup-de-do, I hear you cry).

The problem is that by the moment we realise action is required, the problem has almost always been around for some time and has just pushed through some form of threshold to become noticed.
Now, problems like pain, the stock market, and organisational or individual performance - like just about anything else - don't increase in a smooth, incremental way (see the last article), even if they appear to. They tend to fluctuate. So we notice the problem as the fluctuation crosses a threshold and makes it important to us. We tend not to concentrate on problems while they are below (and building up to) a threshold.
What happens is that, 'on average', such fluctuating issues tend to regress to the mean, or average out. We notice only the peaks, as these have the largest emotional impact. We therefore tend to decide to do something at a peak, once the problem has crossed a threshold (which can be emotional or psychological but is rarely consciously defined - it just feels like a problem now!).
Because of the regression to the mean effect (fluctuating events will almost always come back off the peak and move towards an average situation again, usually below the threshold), we think that whatever action we took - going to the doctor, buying or selling, or changing the organisation in some way - is responsible for the change, when in fact, even without the decision, things were going to even out anyway!

So we tend to decide to do things when problems peak, and assume that the action we took rectified the situation when the problems reduce, even though in all likelihood they were going to decrease anyway. The flaw in the logic that leads to this is to assume that the extraordinary events happening right now are the 'norm' or average for this time and situation. This occurs usually because what is happening in the here and now feels like reality - and this feeds us into a place where we take now as a predictive indicator of the future: if things carry on like this...
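The effect is easy to see in a toy simulation. Here a quantity fluctuates around a stable underlying average; we "notice the problem" whenever a value crosses a threshold, take no action whatsoever, and look at what happens next. The mean, spread and threshold are arbitrary illustrative numbers.

```python
# A small simulation of the regression fallacy: a quantity that
# fluctuates randomly around a stable underlying average of 50.
import random

random.seed(42)
series = [50 + random.gauss(0, 10) for _ in range(10_000)]
threshold = 65  # the point at which the problem "feels" urgent

# Observations immediately following a peak (no intervention made)
after_peak = [series[i + 1] for i in range(len(series) - 1)
              if series[i] > threshold]

overall_mean = sum(series) / len(series)
post_peak_mean = sum(after_peak) / len(after_peak)

print(f"overall mean:            {overall_mean:.1f}")
print(f"mean right after a peak: {post_peak_mean:.1f}")
# Post-peak values cluster back around the overall mean, so any
# action taken at the peak would have looked as though it "worked".
```

Even though nothing was done, the value right after a peak is, on average, back near the long-run mean and below the threshold - exactly the improvement we habitually credit to whatever we did at the peak.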

Which brings us nicely onto another interesting decision making phenomenon. The recency effect - read all about it in the next article.

You may be interested in a couple of workshops I am running in London on 4th March. See here.

Saturday, February 09, 2008

How do we decide when to make a decision?

James is sitting browsing through a year-old magazine and wondering what all the other people in the waiting room are here for. There is an old couple sitting next to each other, talking quietly, who smile politely when he catches their gaze; a pretty young girl in a short skirt who looks miserable and pale; a workman in dirty work clothes holding his arm - he looks like he is in pain, probably an accident. James notices the pain in his stomach as it starts to get sharp again. He first noticed that pain a few days ago and it hasn't really gone away. It eases off at times and at other times it is quite painful. He is worried; it might be an ulcer, or even cancer - maybe he should have come to see the doctor sooner. He keeps pushing that thought to the back of his mind. He looks back to the old magazine.
Shelly is looking at the recent stock market prices. They have been a bit all over the place recently. She wants to invest but is this a good time? Some people are saying this is a good time to buy compared to last month and others are saying that in all likelihood the market prices will fall further in the coming months and maybe over the year. If only she knew what to do.
Arthur is trying to work out if this is a good time to start a full-scale, root-and-branch reorganisation of the business. Market conditions are tough and we certainly need to do things differently to boost the profitability of the company. If we do this now, will it cause too much disruption at a time when everyone really needs to concentrate on the business and on making things more efficient, or will it be just the tonic we need? Clear out the less profitable parts, get the bits working better that need to change, and sort out the management structure. Hmmmm.

Why is it that people decide to do what they do when they do? For example, why do people decide to start an organisational change programme when they do, or go to the doctor when they do, or buy or sell on the stock market - and the million and one other decisions we make when we make them?
All of these things, like most things in life, fluctuate naturally - organisational efficiency or effectiveness, pain or illness, the stock market. Very few, if any, events or processes have a smooth 'glide path' where the changes are wholly incremental and equally progressive. Fluctuation and variation are a natural part of any and every complex situation. So, given that change is part of the system, and that the rate and direction of the change is also variable and frequently defies prediction with any degree of certainty, a question arises: when we decide to make a change - go to a doctor, sell or buy, or do something to change things - how do we decide when the best time is to do any of these things? What is it that prompts most of us to make a decision, especially when things are uncertain?
In the next blog we will have a look at something called the Regression Fallacy and how it gives us the illusion of success and frequently wrong foots us.

Friday, February 08, 2008

The Representative Heuristic - Problem solved, well almost

First, the term 'heuristic' simply means a method (usually informal) that helps to solve a problem - or, put another way, a method of disambiguating an issue.

The term Representative Heuristic was initially used by Kahneman & Tversky (1973) to describe a phenomenon they discovered whilst researching how people make judgements in ambiguous or uncertain territory. What they discovered is that people will frequently look for, and find, similarities between two events or objects, assume that the similarity they have discovered represents a rule, and then apply that rule to create a solution. People look for a (to them) likely explanation of a problem based on similarities they think exist between a few bits of data, and then make this similarity representative of that relationship - in other words, it becomes a rule. It doesn't end there, however, because when people do this they believe the representation they have constructed, assuming that it is correct.

You can try out a modern version of the original Kahneman & Tversky experiment here.

I came across a more common every day example last year:

I was conducting some research in a company and was working with a group of managers, one of whom was in the middle of recruiting for a post in his team. I was invited to observe some of the interviews. About 20 minutes before the next interview, the manager received a call on his mobile. He listened for a second, said "thanks", closed the call and leapt up, saying to me, "come on, this is where I find out if it's worth interviewing the next candidate". We both rushed to the window, which overlooked the company car park.

“There he is in the Silver car just coming through the barrier – let’s see what he does.”

The car then slowly navigated the car park looking for a space.

“We ask visitors to park over there” he said pointing at a few empty spaces marked ‘visitors only’. “Which is really handy as I can see what they are doing.”

The silver car moved to the spaces for visitors and drove straight into one of the available spaces. The car door then opened and a smartly dressed young man got out.

“At last, that’s the first one this week! Someone worth interviewing, finally.”

“Sorry?” I replied “I don’t understand.”

“Look” The manager explained “People who are focussed on work drive straight into their spaces. Everyone else has backed in this week. That means that they are more concerned with leaving than arriving, so I won’t hire them, no matter how good their CV is.”

Representativeness heuristics can be very useful in solving some problems. However, because the logic used to construct them is often of the common-sense variety, they are frequently plausible yet very misleading. (A number of people I have told this story to have said they hadn't thought about it before, but "now you mention it there must be something in it".)

The other problem with this phenomenon is that it is also the basis for bigoted thinking: all women are..., all blacks are..., all homosexuals are..., all engineers are... (Kahneman & Tversky) and so on. One instance comes to represent the similarities that the person has noticed (or, more properly, constructed), and it is now considered to be true. Having constructed the representation, every time we look at the situation we keep noticing (filtering for) the same patterns of similarity and ignoring any differences.

Importantly, it is worth noting that this phenomenon is more likely to be used in new and ambiguous situations. Further, in situations that evoke fear, like unwanted change and loss, the representations created frequently serve to show just how negative the situation really is and to filter out positives and opportunities. That helps then!

Wednesday, February 06, 2008

The Monte Carlo (or Gambler's) Fallacy

There is an old joke about a man who takes a bomb with him every time he flies.
"The chances of an airplane having a bomb on it are very small," he reasons, "and certainly the chances of having two are almost none!"
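The flaw in his reasoning is conditional probability: if bomb placements are independent, carrying your own bomb tells you nothing about anyone else's. A minimal sketch in Python, with an entirely invented probability value:

```python
# A sketch of why the joke fails, assuming bomb placements on a flight
# are independent events. The probability value is entirely invented
# for illustration; it is not a real aviation statistic.
p_other_bomb = 1e-7   # assumed chance some other passenger has a bomb

# The unconditional chance of TWO bombs really is vanishingly small:
p_two_bombs = p_other_bomb * p_other_bomb

# But conditioned on "I brought one myself", the chance that someone
# ELSE also brought one is unchanged -- my bomb is not evidence about
# anyone else's behaviour:
p_other_given_mine = p_other_bomb

print(p_two_bombs)         # tiny
print(p_other_given_mine)  # same as before: no safety gained
```

The man has confused the probability of two independent bombs with the probability of a second bomb given his own, which his own actions cannot lower.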

It's interesting how we view risk. I mentioned in my last blog that perception of risk and the actual probability of an event happening are often (usually) not the same thing.

There is a phenomenon called the Monte Carlo Fallacy, also known as the Gambler's Fallacy. Basically, this occurs around events that are truly (or close to) random, like the lottery, tossing a coin and predicting whether it will turn up heads, or any other similar random event. The phenomenon is that people will often play such games believing that every time they play and lose, each loss brings them one step closer to winning. Let's look first at a gambling example and then widen it out to more everyday events.
Take the lottery – any national lottery where you pick a series of numbers and win if your numbers come up. In the UK the lotto works out at about 14 million to one against winning the big prize. What happens is that people keep playing the same (lucky) numbers every week in the belief that every time their numbers don't win they are one step closer to winning – next week, maybe. The reality, of course, is that every week each selection of numbers still has a 1 in 14 million chance of winning, and it stays that way no matter how long you play. There is no reduction in this chance whatsoever: the numbers that won last week have exactly the same chance of coming up as any others do. It makes no difference what numbers you play; you are still unlikely to win.
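To see this numerically, here is a small simulation sketch, with the odds scaled down from 14 million to a 1-in-1000 game (an assumption purely so it runs quickly). It asks: after losing at least five draws in a row, is the next draw any more likely to win?

```python
import random

# Simulate a scaled-down 'lottery' and measure the win rate on draws
# that immediately follow a losing streak of five or more.
random.seed(42)
ODDS = 1000            # a 1-in-1000 game (scaled-down assumption)
TRIALS = 200_000

wins_after_streak = 0
draws_after_streak = 0
losing_streak = 0

for _ in range(TRIALS):
    win = random.randrange(ODDS) == 0
    if losing_streak >= 5:          # this draw follows 5+ straight losses
        draws_after_streak += 1
        wins_after_streak += win
    losing_streak = 0 if win else losing_streak + 1

# The win rate after a losing streak stays around 1/ODDS -- the streak
# brings you no closer to winning:
print(wins_after_streak / draws_after_streak, 1 / ODDS)
```

However long the losing streak, the conditional win rate hovers around the unconditional one, which is exactly the point of the fallacy.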
The same applies to other large and random(ish) events, like air crashes. The belief that because a plane has already crashed this week, and on average only a few crash a year, I must be safe on this flight is nonsense. OK, that's not much comfort if you are a nervous flyer, but it is realistic at least. There are better indicators of airworthiness, like maintenance schedules, however they don't fully account for the random chance of a series of hitherto unknown issues coming together at some particular time. The stock market is another example. Past performance is not a guide to future performance. How many times have you heard that? It is true, yet people still look to past trends to inform future decisions in situations that are random, or as close to random as makes no difference. It is sort of hard-wired into us.
This is the other side of risk aversion. As mentioned yesterday we are more likely to be risk averse if there is a perception of potential loss as opposed to a perception of potential gain.
Because things don't really happen in threes (sorry!), and because what we believe is not always true, having an appreciation of the psychology behind risk behaviour and thinking starts to help, especially during uncertain events like organisational change, which is often a really good instance of ambiguity plastered over and made to look rational. We never truly know what the effects of re-engineering an organisation will be, even if we believe we do. We can have a good guess, but it is not guaranteed and it is a lot more ambiguous than most OD professionals would like to admit.
In the next blog I will explore what is called the 'representativeness heuristic' which sheds some light on why people engage in the Monte Carlo Fallacy and why risk averse and risk taking behaviour is an important issue, especially in times of uncertainty.

Tuesday, February 05, 2008

Risk aversion research

Whilst teaching at the Medical Sciences Division at Oxford University this week I came across a young D.Phil student conducting some very interesting research into risk and risk aversion in humans, through the medical lens of gambling addiction. Now, clearly I am not going to pre-publish someone else's research, especially not a student's, however the conversations we are having and the other research we have discussed are available to share, and they throw an interesting light on risk-averse behaviour, ambiguity and emotional resilience.
First I just want to reiterate a couple of things that I have mentioned before: risk aversion is an emergent property of an individual's emotional reaction to a situation that is perceived to be ambiguous or uncertain, and risk-averse behaviour usually differs depending on whether the risk is considered to be a risk of gain or a risk of loss. Normally people are more willing to take a risk if they believe there is a potential large win and only a small loss. This is why many more people will risk a few pounds or dollars on a 14 million to 1 chance of winning the lottery (almost certain to lose their money) without thinking, and yet won't engage in stock ownership even though the likelihood of profiting is far greater there.
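A back-of-envelope expected-value sketch makes the comparison concrete. Every number here is an illustrative assumption (jackpot size, stock return), not real market data:

```python
# Expected value of a lottery ticket vs a small stock investment,
# using invented illustrative figures.
ticket_cost = 2.0                # pounds
jackpot = 5_000_000.0            # assumed prize
p_win = 1 / 14_000_000           # roughly the odds quoted above

ev_lottery = p_win * jackpot - ticket_cost
print(round(ev_lottery, 2))      # about -1.64: an expected loss per ticket

stake = 2.0
assumed_annual_return = 0.05     # assumed average stock market return
ev_stock = stake * assumed_annual_return
print(round(ev_stock, 2))        # small but positive expectation
```

Under these assumptions the lottery has a firmly negative expectation and the stock a modestly positive one, yet the emotional pull of the big win dominates the choice for many people.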
So most people have a natural tendency to avoid loss. This is the case whether the loss is financial, personal (like a job, role or position) or social (like a relationship); people will often suffer severe hardships rather than lose something, even a bad relationship or a job they don't like.
The risk aversion in these cases is anticipatory: the loss hasn't actually happened, and cold calculations of probability rarely affect the emotional reaction. (Which is why we often concentrate on emotional resilience in our workshops and coaching.)
To be continued...

Monday, February 04, 2008

Risk aversion and the law of unintended consequences

The news that the UK credit card company Egg is about to withdraw 161,000 credit cards from customers who are considered to be 'higher risk' is an interesting case study in risk aversion.
On one level their actions make a lot of sense for the company. If they are actually targeting individuals who pose a higher risk (and there is a question about how they have made this decision) in times of economic slowdown, then restricting those individuals' ability to get into debt does reduce the company's exposure to risk later on – but only if other companies don't do the same thing.

In times of tighter money, and in particular tighter credit supply, reducing people's flexibility to move and access money is very likely to hasten bankruptcies. If you are in a tight spot and your emergency supply (the credit card) dries up, and there is no other way to get money, then you can't pay your debts, you go bankrupt and you default on everything. This means that if every credit company follows suit, as is widely expected, then this is quite likely to accelerate the number of payment defaults – the very thing the strategy is trying to prevent.

This is a typical mode one risk-averse reaction (mode one is from the book The Ambiguity Advantage, more of which later). When things look difficult, more controls are put into place. Logically this appears to be the right thing to do: 'things are going to be tight so we will reduce spending (or in this case the ability to spend) across the board'. That makes sense for an individual credit company. However, if everyone does the same thing, the more they all control the money supply the less there is to spend, the less there is to spend the less people buy, the less people buy the less profit there is, the less profit there is... you get the picture.
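That spiral can be sketched as a toy feedback loop. The 10% cut per round is a pure illustration, not an economic model:

```python
# Toy feedback loop: each round, tighter credit cuts total spending
# by a fixed fraction (an invented 10% for illustration).
spending = 100.0
shrink = 0.90                    # assumed fraction of spending that survives each round

for round_no in range(1, 6):
    spending *= shrink           # less credit -> less spending -> less profit
    print(round_no, round(spending, 1))
```

After only five rounds of everyone tightening together, spending in this sketch has fallen by roughly 40%, which is how an individually sensible control becomes a collective problem.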

So a risk reduction strategy that works for one company for a limited time, when copied and used widely is likely to actually bring about the very conditions they are trying to prevent.

This is also true within companies. Many companies we have seen cause themselves problems when things get tough by reducing spending and effort on the wrong things. So at the very moment when things need to change and employees need to think differently, get creative and find new ways of doing things, activities like training and development, coaching and the like are usually slashed, thereby exacerbating the situation.

Risk aversion often brings about the very thing we are trying to prevent.

Saturday, February 02, 2008

Leadership when people are scared III

When you are leading people who are scared here are some do's and don'ts:

  1. Learn to become emotionally resilient. In a recent study we found that this is the single biggest factor in dealing well with difficult and ambiguous situations. It means you can control the balance between your emotions and your thinking whilst still keeping in touch with your instincts and remaining empathetic. More about this.
  2. Give people a sense of positive direction and movement. This really helps when people are unsure and emotions are running high.
  3. Communicate lots and keep in touch with people. One of the biggest failures, especially in situations of flux and change, is failing to keep people with you every step of the way. Act as if you are on a mountain in bad weather with poor visibility: rope everyone together and move as a group.
  4. Listen lots. All too frequently when the going gets tough, managers and leaders become autocratic, start telling and stop listening. History is full of instances where leaders wouldn't listen to 'negative talk'. This is a big mistake. Listening to everything can tell you so much: how people are thinking and feeling, whether dangers on the horizon are getting closer, and more. The special forces (SAS etc.) have what are called Chinese Parliaments. These are group meetings where everyone, regardless of rank or status, can (and is expected to) have their say and put forward ideas, thoughts or questions, including criticisms of leadership decisions and thinking. This form of tough love has lots of advantages.
  5. Keep people busy and thinking. The more they have to think, the less time they have to feel scared. So let them solve problems, rather than doing the solving yourself, which is what most leaders tend to do (often poorly – see this). Make sure any tasks you give people are real tasks; people know when they are just being kept occupied with meaningless ones. Navigating difficult and ambiguous times is a team effort.
And the don'ts:

  1. Panic! Keeping control of your emotions is vital now. If others think that you are in a negative emotional state and making knee-jerk reactions rather than using the resources around you, they are unlikely to follow willingly. Additionally, you may find that people start to lose respect for you.
  2. Make decisions in isolation no matter how good you think they are. Test them out and modify them in the light of different thinking from others.
  3. Surround yourself with comforting people who think like you do. This will just lead to groupthink. Find diversity in views and challenge. Now more than ever, everything – and I mean everything – needs to be challenged. Failure in difficult and uncertain situations usually comes about because of assumptions that go unchallenged. Paradoxically, right now you need people who will confront, contest and question, which is often the opposite of what most leaders do in difficult situations: surrounding themselves with people who will agree and make them feel comfortable with their decisions. This is another reason why you need to be emotionally resilient.
  4. Shoot down daft ideas. Right now you need all the creative, imaginative and innovative ideas you can get. Treat even one idea with contempt, or with anything less than a warm welcome, and you will kill creativity stone dead. Even the daftest (to you) ideas could well lead to an opportunity or the solution you need. Far too many leaders and their teams shut down ideas prematurely and never find that killer concept. Nurture all ideas and see what flowers.
  5. Try to control things too much. The normal response to difficult times is to start to put in place tougher controls. This is often a very big mistake as it stops great and creative things happening, those happy accidents that get you out of the pile of poo you are in. These are known in complexity theory as emergent properties and they can only really occur when things are allowed to flow. More controls are frequently a sign of an emotional response to what is seen as a negative situation. Chill, look for opportunities and play!