How beliefs obstruct organisational learning - Team Business

How beliefs obstruct organisational learning

Beliefs obstruct successful learning, problem solving, planning and decision-making.

Have you ever wondered why bad news seems to creep so slowly up the company hierarchy? Or why senior management so often meets bad news with utter disbelief? Similarly, are you perplexed that managers regularly shoot the messenger? Or frustrated when managers treat whistle blowers as the problem rather than the issue they are exposing? These are all examples of how beliefs obstruct organisational learning.

This topic is important. Demonisation of opponents and resistance to change or uncomfortable facts perpetuate organisational problems. These familiar negative cognitive traits cause untold harm to both organisational effectiveness and organisational learning. They are responsible for any number of classic business and public sector failures.

Fortunately, we now have a better understanding of why people fall into this irrational and destructive behaviour. With that understanding there is some hope of finding a solution.

Why beliefs obstruct organisational learning

The problem arises from a peculiar brain mechanism. Rather bizarrely, where new information conflicts with our existing beliefs, our brains filter the evidence out of our conscious perception. To put this another way, our brains are wired so that we literally cannot see what we do not already believe. When this deception occurs, we make what psychologists term a ‘premature cognitive commitment’.

In other words, we jump to a quick conclusion. The problem is that we base this conclusion not on the available new evidence but on some belief or preconception. Our snap conclusion is even more damaging if we base it on an emotional memory of some associated past event or situation.

An illustration from the animal world is well known in India. Traditionally, captive baby elephants are kept secure by tying one foot to a stake hammered into the ground. This method is sufficient to stop a baby elephant from running away, but it is totally inadequate to stop a fully-grown one from uprooting the stake and ambling off. What happens is that the young elephant develops a premature cognitive commitment to being unable to break free. This conditioning is so strong that as an adult it still ‘believes’ the flimsy tether constrains its movement. The grown elephant’s past experience conditions it to see the stake and rope as far stronger than they actually are.

Cognitive dissonance and confirmation bias

As you can see, whenever we make a premature cognitive commitment we suffer from a serious self-limiting belief. The brain uses beliefs to help us make sense of a situation quickly. Unfortunately, this can sometimes come at the expense of really learning what is going on. The problem occurs because we are comfortable with our beliefs, whereas the process of changing them, or being confronted with their inadequacy, is acutely uncomfortable. In fact, the discomfort triggers a stress response. This mental stress is known as ‘cognitive dissonance’.

The relevance to decision-making is clear. Where the apparent facts contradict entrenched beliefs, cognitive dissonance compels us to do one of two things. In effect we get the choice of either doing what is right or doing what is easy.

So, in other words, to reduce the pain of the dissonance we either (1) change our belief to match the revealed facts, or (2) try to preserve our belief by engaging in what has been termed ‘confirmation bias’ (sometimes known as ‘myside bias’). Unfortunately, much of the time, as a ruse to escape the stress of having our beliefs challenged, we prefer the second approach.

Eight ‘confirmation bias’ tactics

Confirmation bias denotes the subtle tactics we subconsciously adopt to perpetuate a belief. These tactics work by blinding us to the truth.

Such tactics include:

1.    Different forms of misperception of what is going on
2.    A hunt for evidence that backs our existing beliefs
3.    Rejection or refutation of the contradictory information
4.    Misinterpretation of the information to reinforce our beliefs
5.    Seeking support from others who share the beliefs
6.    Attempts to persuade others that our beliefs are valid anyway
7.    False or selective recall – remembering only what we want to remember
8.    Attributing negative or ulterior motives to those advocating views that conflict with our beliefs

Premature cognitive commitment and its ugly offspring, cognitive dissonance, explain why managers often meet bad news with disbelief. They also explain why the people at the top often find it difficult to accept negative feedback: such feedback disrupts their settled worldview (belief) of how they are running things.

As a result of confirmation bias, negative feedback moves slowly up the hierarchy, if at all. The peculiarity of the brain’s filtering system also explains one other perennial phenomenon of dysfunctional hierarchies. Where confirmation bias kicks in, ‘whistle blowers’ offering differing opinions or exposing the facts are often chastised or gagged. In a healthy organisation, the very term whistle blower is anathema. People giving negative feedback need to be embraced as saviours who provide the very means for the organisation to improve and move forward.

We need more whistle blowers

The tendency to shoot the messenger further reinforces senior management’s isolation from operational reality and further constricts ‘bounded rationality’.

As we know, executives, like the rest of us, have cognitive limitations that restrict their view of what is going on in their own organisation. Unfortunately the existence of this ‘bounded rationality’ does not prevent managers from believing they know what’s going on and from acting accordingly.

This delusion is obviously dangerous.

Of course, it does not help matters where these beliefs have become self-serving. And in today’s corporations and public sector organisations, senior managers have a self-serving need to protect status and careers, with six-figure salaries and pensions to match.

The higher up the organisation bosses go, the more depressingly difficult it becomes for them to acknowledge that their beliefs might be at odds with reality. The scale of their adverse reaction is directly proportional to the amount of status, power, money and reputation under threat.

Public sector organisations lack a Darwinian balancing mechanism to penalize confirmation bias

In private enterprises, there are balancing factors that limit the impact of confirmation bias. Market forces such as alert competitors, fickle customers or investors voting with their feet inject a dose of reality. Such factors either jolt management out of its Alice in Wonderland world or drive the company into bankruptcy.

However, public sector organisations and large state-sponsored charities tend not to have this Darwinian balancing mechanism. Delusion, and therefore failure, is perpetuated by weak accountability from below and above. The consequences, as we have seen from a string of public sector projects, can be disappointing performance at best. At worst, this lack of accountability leads to catastrophic failure.

The Rotherham child protection services failure is a case study in confirmation bias

A chilling example of such a catastrophe occurred at Rotherham Metropolitan Borough Council in the UK. In this abysmal case of almost inconceivable incompetence, the local social services and allied agencies failed to protect over a thousand children from ‘industrial scale’ brutality. Much of the abuse was perpetrated by predatory gangs of Pakistani sex abusers.

Professor Alexis Jay, who wrote a damning independent report on the scandal (a report the council itself initially met with disbelief), asserts that “nobody could say ‘I didn’t know’.”

Unfortunately, given what we now know about ‘bounded rationality’ and the associated problems of premature cognitive commitment and cognitive dissonance, that is probably exactly what happened. The people at the top subconsciously refused to see anything that conflicted with their existing beliefs.

The Rotherham case shows how beliefs embedded in the culture of an organisation can carry the same drawbacks, in terms of their impact on rational thinking, as beliefs held by individuals. In Rotherham, as in many public sector authorities, one of the prevalent beliefs was multiculturalism. This manifested as an institutional refusal to believe that one ethnic group was committing the predatory sex crimes. A couple of other institutional beliefs also got in the way of the facts.

Institutional beliefs get in the way of the facts

One was the belief that the ‘sorts of girls’ who were suffering abuse were ‘asking for it’ anyway. (According to Simon Danczuk MP, writing about the scandal in a Times article, the director of children’s services implied to him that young girls who were being raped were ‘making lifestyle choices’.) Another entrenched belief was the typically self-serving one that government-run child protection agencies do a better job of protecting children than their parents can. (The police actually arrested some parents for trying to protect their children from harm.)

The use of the phrase ‘self-serving’ may seem harsh here, but it is fully justified. The purpose of the child protection services is to protect children. Believing this to be the case gives important meaning to the work done by the teams of social workers and their managers. And meaning is an important motivational human need. Where the evidence on the ground suggests that this protection is far from working, that important sense of meaning is violated.

Having meaningful work meets a powerful emotional need

Violating an emotional human need, as we know, can quickly trigger a powerful stress response. In other words, the sort of cognitive dissonance provoked by this type of situation, where the evidence conflicts directly with both the subject’s beliefs and their sense of meaning, is extremely uncomfortable. The person faces evidence that undermines the very purpose of their work. It also challenges their belief in their own competence (a feeling of competence being another emotional need) and perhaps the validity of their organisation. As a result, the reptilian survival mechanism kicks in. At that point it becomes far easier, quicker and simpler to ignore the evidence and carry on as usual.

The next tactic is to try to persuade everybody that everything is OK (confirmation bias tactic number six, as seen above). This is by way of an explanation, not an excuse.

As you can no doubt imagine, with these sorts of ideologies embedded in Rotherham’s child protection system, the organisation was chronically unable to see what would have been blindingly obvious to any objective observer in their rational mind – Professor Jay, for one. (Another observer who managed to retain her rational mind was the ‘whistle blower’ Jayne Senior, a youth service manager who passed 200 files to the Times in a successful attempt to expose the horrors going on.) Typically, the response from the local council was to seek a criminal inquiry into the identity of the whistle blower. Again, cognitive dissonance prevented the council from taking the far healthier approach of embracing the information as invaluable bottom-up feedback.

When an ideology takes root, the learning stops and executives behave like corporate zombies

The key lesson is that when an ideology takes root, the learning stops. Instead, executives behave like corporate zombies refusing to see what they don’t already believe in. As a result, institutional ideologies make life very difficult when it comes to adapting and refining procedures and processes in response to new information or dynamically changing circumstances.

Premature cognitive commitment also explains why, in the political arena, ideological parties and regimes tend to be repressive and to obstruct social and economic progress. Fervent belief in their respective ‘isms’ prevents adherents from processing adverse information that contradicts the belief. Instead, their ignorance compels them to embark on an evangelical crusade to persuade others to follow the same beliefs. Other less than useful results include dogged resistance to change, the demonisation of opponents, hysterical opposition to alternative ideas and solutions, and the persistent advocacy of anachronistic and otherwise highly inappropriate solutions that tend to perpetuate the problems rather than ease them – all this to escape the pain of adapting to new ideas and new evidence.

The good news is that sooner or later ideologically driven causes, movements, parties, governments and institutions are prone to self-destruct. The bad news is that before they go down, they can do untold harm and damage to their stakeholders and anyone else who gets in the way.

Self-belief is an obstacle to self-awareness and learning

Unfortunately, this belief mechanism applies to self-beliefs as well. Our beliefs about ourselves get in the way of seeing the reality about ourselves. From childhood onwards we cultivate self-beliefs. These help us come to terms with the complex and sometimes chaotic or traumatic world we live in. Unfortunately, even as life goes on and circumstances change, these beliefs persist as emotional impressions in the mind. It is these emotional impressions that filter out evidence that contradicts them.

I once worked with a business owner who liked to believe she took a genuine, almost maternal interest in her employees’ welfare. In truth she was very personable and charming, and often showed attentiveness to her employees’ concerns, their families and so on. However, the underlying reality was that her business didn’t make a lot of money.

Significantly, the money it did make was largely due to the low wages she paid.

As it happened, the lady was particularly adept at getting people to do a lot of free overtime. In fact, the average overtime was about 20 hours a month, although some key staff did a lot more. Now, with 100 employees each doing an average of 20 hours’ free overtime a month, you can do the sums. In effect she was neatly adding about £200,000 or so of free work a year to the bottom-line profit. This made life quite comfortable for her financially.
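The sums can be checked in a few lines. The 100 employees and 20 hours a month come from the text above; the hourly wage is my assumption, a hypothetical low-wage figure chosen to be consistent with the story:

```python
# Hypothetical check of the overtime arithmetic described above.
employees = 100                 # from the text
overtime_hours_per_month = 20   # average free overtime, from the text
assumed_hourly_wage = 8.33      # GBP -- an assumed low-wage figure

annual_free_hours = employees * overtime_hours_per_month * 12
annual_value = annual_free_hours * assumed_hourly_wage

print(f"{annual_free_hours:,} unpaid hours a year")        # 24,000 unpaid hours a year
print(f"worth roughly £{annual_value:,.0f} in free work")  # worth roughly £199,920 in free work
```

At any plausible low-wage rate the figure lands in the region of the £200,000 quoted, which is why the free overtime alone could carry the bottom line.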

A comfort zone reinforces confirmation bias

Sitting in her comfort zone, making a tidy profit, the owner failed to take responsibility for studying the systemic reasons for her organisation’s poor performance. What she really needed to do was carry out rigorous research and inquiry to find ways to make it easier for her people to do their work. That way efficiency would improve and profits could rise, but not at the expense of wages. Instead, the owner preferred to focus her energies on team-building exercises and her ‘coaching style’ of leadership. These activities appealed far more to her self-belief about being a compassionate and caring team leader.

Essentially, the owner was very good at getting the staff to hold the company together through their excessive input of energy and goodwill, and she lived well off their hard work and low pay. Ironically, although employees generally said they liked the owner, there was at the same time a steady drain of the longer-term, experienced employees. Most of the departures were due to burnout and stress. This attrition of long-term experience had a further deleterious impact on the efficiency of the system.

So, in other words, the reality of the working environment was the very opposite of her self-belief of being a ‘nice boss’. Drawing her attention to this fact triggered cognitive dissonance and a strong emotional arousal, in this instance expressed as self-righteous indignation. People don’t like their self-beliefs shaken; it frightens them.

Senior management’s self-confidence can be a costly delusion

The type of person occupying the higher reaches of a large organisation arrives at that level due, in large part, to a high degree of self-confidence. Research shows that this self-belief in their ability to manage does not necessarily match reality. There is an unsettling implication here for organisational design: as status rises with promotion, this self-belief becomes ever more ingrained. Thus, once a manager achieves high office, it becomes even more difficult for inputs of contrary information to penetrate and alter their cherished self-beliefs.

This factor is never clearer than when attempting performance improvements.

I often find that where an executive has cultivated a belief about their own high level of competence, it is very difficult for them to accept that lagging organisational performance may be due to inadequacies in the system. They, after all, ‘own’ the system, and since they believe they are so competent, it follows that the system must be OK. It is far easier and less painful to believe that failures are due to someone, somewhere ‘screwing up’. It is this delusion that provokes a search for culprits rather than a hard look at the weaknesses of their own creation. Unfortunately, the whole ‘witch-hunt’ process creates more organisational stress, which further damages the organisation’s ability to operate effectively.

This type of emotional and irrational denial flies in the face of ‘systems thinking’. A key principle of systems thinking is that some 95% of variations in performance are due to the system and only 5% to the people working within it. The search for someone to blame therefore misses huge opportunities for renewal and improvement. Blame games and witch hunts simply perpetuate a cycle of decline and low productivity.
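The practical consequence of that 95/5 split can be shown with a toy simulation (the numbers are entirely hypothetical, not data from any real organisation): give each ‘employee’ a small, stable skill difference, add the much larger month-to-month noise injected by the system, and then see who gets labelled the worst performer.

```python
import random

random.seed(42)

VAR_SYSTEM, VAR_PERSON = 0.95, 0.05   # the 95/5 split from systems thinking
STAFF = 20

# Small, stable differences in individual skill...
skills = [random.gauss(0, VAR_PERSON ** 0.5) for _ in range(STAFF)]

def monthly_scores():
    # ...swamped each month by systemic noise: machine downtime,
    # supply delays, process friction and so on.
    return [s + random.gauss(0, VAR_SYSTEM ** 0.5) for s in skills]

# Track who comes bottom in each of 24 simulated months.
worst_each_month = []
for _ in range(24):
    scores = monthly_scores()
    worst_each_month.append(min(range(STAFF), key=lambda i: scores[i]))

print(f"The 'worst performer' label fell on "
      f"{len(set(worst_each_month))} different people in 24 months")
```

Because the system’s noise dwarfs the skill differences, the ‘worst performer’ label hops from person to person; blaming whoever comes bottom this month mistakes systemic variation for personal failure.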

Confirmation bias is not inevitable

As you can hopefully see, confirmation bias is highly destructive of organisational learning. The good news is that we can remedy it. Our psycho-physiology does not condemn us to automatic crass stupidity when confronted with radical new ideas. Similarly, we do not have to yield to the instinctive stress response when faced with changes that contradict our existing beliefs.

There is no space in this article to go into all the various remedies available. Suffice it to say that they fall into three categories: ‘self-help’, ‘third-party help’ and ‘leadership intervention’. Appropriate leadership intervention can be particularly useful, especially where it involves group collaboration. Among other advantages, collaborative working tends to reduce stress levels among the participants.

Structured planning helps reduce stress levels and improves access to information

As these problems are so widespread, it is highly advantageous for leaders to avoid rushing the planning process. Instead it is best to take the time and trouble to go through a structured team-based approach to decision-making. Such an approach enables the deliberate development of a decision through a sequence of seven steps. These steps involve research, root cause analysis, ongoing consultation with relevant stakeholders and a balanced evaluation of alternative solutions.

One of the benefits of this approach is that it helps ensure we make decisions in context, so avoiding unintended consequences. Another is that the collaborative approach is a powerful aid to reducing personal stress levels. Lower stress helps override the default mechanism of stressed thinking provoked by cognitive dissonance. In ‘Reinventing management thinking’ I outline eight ways this sort of team planning reduces stress and confirmation bias. Training managers to develop a working understanding of ‘systems thinking’ also helps in other ways: a comprehension of systems thinking helps overcome the prejudice that the people are at fault, and managers find it easier to accept the reality that it is nearly always the system that is failing.

This article comes from chapter 11 of “Reinventing management thinking – Using science to liberate the human spirit” by Jeremy Old, now available from Amazon

 
