The Impact of Beliefs and Assumptions on Decision Making
Author: James B. Rieley. Volume 1. Issue 1. June 21, 2012.
Abstract: This paper presents a case study on how different beliefs and assumptions produce different interpretations and actions in response to the same situation. Argyris and Schön's framework for learning and the Ladder of Inference provide the theoretical foundation.
Our beliefs and assumptions play a huge part in the way we make decisions. The reason for this can be found in a phrase that has become almost a mantra for systemic thinkers. The phrase, “We don’t believe the world we see; we see the world we believe,” can shed light on why decisions often do not make sense to others and, in addition, on how our beliefs affect our willingness to learn. The whole learning thing can be a bit confusing. We, as a species, do seem to have ‘learning’ hard-wired into our DNA. We love to learn, and the first years of our lives are clear demonstrations of this. We learn to talk. We learn to walk. No one teaches us how to do these things. No one stands in front of a flipchart and makes a diagram of vocal cords and airflow, nor do they draw little stick-figures that explain the inter-relationship between balance and muscle adjustments that makes walking possible. We just learn to do these things…because we want to learn how to do them. And whilst learning does seem to be part of our very being, for many of us it appears to operate on a selective basis.
Part of learning, especially as we get older, is being willing to unlearn some of the things we already know. What we already know becomes validated as our ‘truths’ through a process in which the beliefs we hold cause us to continually look for data that supports them. This becomes a reinforcing dynamic; i.e. the stronger our beliefs, the more we look for evidence that our beliefs are the right beliefs, which then makes them stronger, causing us to look…etc. But there are some situations in which it becomes apparent that our beliefs may not reflect the reality of a given situation. This typically causes one of two things to happen. Either the one holding the beliefs becomes defensive and thinks that he (or she) is the only smart person; or, the one holding the beliefs decides that perhaps it might be good to explore how his (or her) beliefs led him (or her) to where they are.
In the case of decision-making, the person who believes that he is the only one who understands what is going on, upon discovering (or being told) that his decisions were not producing the expected results, often, as Chris Argyris (1991) identified in “Teaching Smart People How to Learn,” slides into a defensive-thinking mode in which he mentally constructs defensive routines that explain why what happened wasn’t his fault. This defensive thinking manifests itself in placing blame on others and avoiding any critical examination of how his own actions or decisions may have produced the outcomes that were seen. When this happens, the ability to learn ceases, due to the unwillingness of the subject even to explore the possibility that his beliefs are based on an unstable foundation. The other scenario – being open to exploring how our beliefs are established and their impact on our decision-making – is a viable option, especially in organisations that recognise that the ability to learn faster than the competition can be the real differentiator for sustainable organisational success.
An appropriate question might be: how can we find out more about how our beliefs are impacting our decision-making? In the mid-1990s, a group was put together to explore this, and their methods were highly successful. The STOL group (Systems Thinking and Organisational Learning) began as an ad-hoc group at a college that had been going through some pretty tough times. The Head of the College had been sacked and replaced by someone who had a relatively hard-core management style. (Note: ‘relatively hard-core management style’ may not quite capture what employees saw when the new man came…it was, to paraphrase many of the employees, much as if Genghis Khan had become the new college President.) The STOL group, without any formal support, decided that in order to learn more about systems thinking, they would use the current situation at the institution as a case study.
After a relatively short period of time, the group had identified the dynamics at play and, after having their work validated by a large spectrum of non-group-member employees, decided that they should show their work to the new President. The presentation was actually a set of causal loops on a single sheet of paper, and after some explanation of how to read the loop structure, the President ‘got it.’ But, as happens in many environments, he did not like it at all.
What the loops showed was that whilst the new President was making headway in resolving many of the financial problems that had been plaguing the institution, his management style was resulting in a sinking organisational climate. In order to turn this around, the President did adapt his style, but the results of his learning were not visible. The STOL group took this as another opportunity to continue to learn, and set about exploring what was behind the evidence that, regardless of what the President did and how he did it, there was still the belief that his management style was hard-core and debilitating.
As the de-facto leader of the STOL group was a direct report to the President and often was instrumental in writing his speeches, it was determined that the STOL group would use this connection to try to understand the size of the apparent disconnect between what was being said by the President, and what was being heard by employees. At the next management team meeting (the college had an extended management team of 120+ people), as soon as the President had completed his talk, two dozen managers were brought together and each given a piece of paper that had been divided into two columns. In the right-hand column, key points from the President’s talk were listed, and the left-hand column was completely blank. The managers were asked to read each of the talk key points, and then in the left-hand column, write down what they ‘heard’ when the President said those words. The key is the ‘what they heard,’ and in this case, the STOL group wasn’t after what words were heard – they had access to a copy of the talk before it was made, so they knew the words. What they were after was more of the subliminal ‘heard,’ the ‘what went through your mind when you heard those words spoken.’ What the group found when the exercise was over was quite revealing.
Right-hand column text such as “we have been able to sort through quite a bit of the debt that the previous administration had collected, but we still have some distance to go in order to be able to resume investing for the future” generated left-hand column comments such as “if it is getting better then why are you still cutting my budget;” and, “great, we are getting through this mess;” and, “you s**t, you will use this as an excuse to not let us have a rise in salary during next month’s contract negotiations.” Another right-hand column text excerpt was “neither (name of Executive Vice President) nor I have all the answers, but if we all work together, (name of college) will once again be recognised for helping our students, faculty and staff realise their potential.” Left-hand column comments for this included, “ha. You admit you aren’t that smart;” and, “wow, that must have been pretty difficult for him to say. Maybe he really is changing and maybe we should give him the benefit of the doubt;” and, “he is up to something and we have to keep on our guard.” Members of the STOL group had believed that the majority of managers weren’t really ‘hearing’ what the President was saying, and now they had real evidence that this was going on. As part of their own learning process, STOL group members then went on to explore why it is that, when hearing (or reading or seeing, or using any of the senses to take in) the same thing, two people may come to different conclusions and, consequently, take different actions. For this they used the Ladder of Inference.
The Ladder of Inference had been identified by Argyris (1990) in his book Overcoming Organisational Defenses as the metaphorical way in which we go, step by step, from being exposed to data of some sort all the way to taking action. Argyris identified seven ‘rungs’ of his ladder, each one building on the previous one. At the bottom of the ladder is the first rung, in which we hear, read, or see something as a part of the ‘data’ that we are always surrounded by. On the second rung of the ladder, we select one piece of the available data to focus on. On rung three we begin to add meaning to the data we have selected, followed on rung four by making assumptions about what those meanings are and what they represent. On rung five of the ladder, we draw conclusions from the assumptions we have made. On rung six, we adopt beliefs based on the conclusions we have drawn (which were gathered from the assumptions we had made, which were made based on the meanings we attached to the data we had selected to focus on). On the seventh and final rung, having adopted beliefs, we actually take action; i.e. do something. Whilst Argyris’ Ladder of Inference is fascinating, what makes it so powerful are two things. Argyris identified that the beliefs we adopt set up a reinforcing dynamic with the data we select. This dynamic causes us to continually select data that supports our beliefs, creating a situation in which it is difficult for us to question our own mental models (beliefs and assumptions about what is right or wrong), because we avoid even considering that they might not be valid. Argyris also stated his belief that the time it takes to go up the ‘ladder,’ from being aware of the data surrounding you to taking some type of action, is about the same amount of time it takes to snap your fingers.
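The step-by-step climb and the reinforcing loop back to data selection lend themselves to a small programmatic sketch. The following Python fragment is only an illustration: the rung labels and the keyword-based ‘selection’ are my simplifications for demonstration purposes, not part of Argyris’s own formulation.

```python
# A minimal sketch of the Ladder of Inference as a process. The rung
# names and the keyword-matching "selection" below are illustrative
# simplifications, not Argyris's own wording.

RUNGS = [
    "observe the pool of available data",        # rung 1
    "select data to focus on",                   # rung 2
    "add meaning to the selected data",          # rung 3
    "make assumptions based on those meanings",  # rung 4
    "draw conclusions from the assumptions",     # rung 5
    "adopt beliefs based on the conclusions",    # rung 6
    "take action based on the beliefs",          # rung 7
]

def climb_ladder(available_data, beliefs):
    """One trip up the ladder, with the reinforcing loop made explicit.

    `beliefs` is a set of keywords. Rung 2 selects only data items that
    match an existing belief, modelling Argyris's point that adopted
    beliefs feed back into which data we notice at all.
    """
    # Rung 2: selection is biased toward belief-confirming data.
    selected = [d for d in available_data
                if any(b in d for b in beliefs)] or available_data[:1]
    # Rungs 3-6, collapsed here: the selected data simply strengthens
    # and extends the existing beliefs.
    new_beliefs = beliefs | {word for d in selected for word in d.split()}
    # Rung 7: act on the (now stronger) beliefs.
    action = "act on: " + "; ".join(selected)
    return selected, new_beliefs, action

# Two listeners with different starting beliefs, hearing the same talk,
# select different data and therefore end up taking different actions.
talk = ["budget cuts continue", "debt is shrinking"]
print(climb_ladder(talk, {"cuts"})[2])   # fixates on the cuts
print(climb_ladder(talk, {"debt"})[2])   # fixates on the debt
```

The point of the sketch is the feedback: `new_beliefs` grows from whatever was selected, and on the next pass that enlarged belief set biases rung 2 even more strongly, which is the reinforcing dynamic the article describes.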
The STOL group thought that if they could get some of the people who had participated in the left-hand column exercise to be willing to explore why they ‘heard’ what they did, this could be a powerful learning exercise for them, as well as an opportunity to find a way to help build greater alignment in the institution. What was fascinating was the fact that the case subjects for this effort came from two different mind-sets: those who had responded in the left-hand column with more or less what the President had said during his talk, and those who had heard the same words but come to completely different conclusions based on them.
The two sample Ladders of Inference show how different people, sitting in the same presentation and hearing the same words spoken with the same inflection, can arrive at two completely different sets of actions. With this evidence of what can happen, the STOL group presented their findings, along with a strong recommendation that if the institution was to be able to move forward, it must make what they had discovered a formalised topic on all management team agendas. Their rationale was that this was the only way to take the conversations that were going on at the coffee machine and bring them into the organisation, so the institution would have a chance at increasing its level of alignment and improving its institutional climate.
There were some strong lessons learnt in this process, and these lessons could be critically important for anyone who wishes to try to replicate this story in their respective organisations.
- Before attempting to do this (or something like this), it is important for the person leading the group to have a solid knowledge of the current politics in the organisation. You will need to have a solid relationship with the CEO in which both he (or she) and you understand that whatever is put forward as a learning experience is not personal but, in fact, an activity that could improve both the organisational climate and organisational performance.
- The person (or persons) leading this type of effort needs to have a high level of confidence and an equally high level of knowledge of systems thinking and systems thinking tools. This includes not only how to use these tools, but also what some of the unintended consequences of using them could be.
- An understanding that influencing mental models is an extremely powerful way to change demonstrated behaviours.
- The knowledge that nothing is forever. In the case of this story, after the head of the STOL group left the organisation, the influencing dynamics changed, and the STOL group and much of what they had accomplished slipped away.
James B. Rieley is an advisor to CEO’s and senior leadership teams from all sectors. He was the CEO of a successful manufacturing company for over 20 years, and has written extensively on the subject of personal and collective organisational effectiveness. He is the author of Gaming the System (FT/Prentice Hall), Leadership (Hodder), Strategy and Performance (Hodder), Change and Crisis Management (Hodder), as well as numerous articles and the subscription-based Plain Talk about Business Performance newsletter. His work has been cited in Fast Company, Making It Happen: Stories from Inside the New Workplace, A Fieldguide for Focused Planning, and Breakthrough Leadership. Rieley, who holds an earned Ph.D. in Organisational Effectiveness, lives in Mallorca, and can be contacted at firstname.lastname@example.org.
Staff who worked on this paper: Editor: Barry Clemson. Reviewers: Gene Bellinger, Nicolas Stampf, Beth Robinson, Richard Wright, Anne Maguire
Argyris, Chris. (1990). Overcoming Organisational Defenses. Prentice Hall.
Argyris, Chris. (1991). Teaching Smart People How to Learn. Harvard Business Review, May 1991.
Citation details for this article
Rieley, James. (2012). The Impact of Beliefs and Assumptions on Decision Making. Systems Thinking World Journal: Reflection in Action. [Online Journal]. Vol. 1, Issue 1. [Referred 2012-06-21]. Available: http://stwj.systemswiki.org. ISSN-L 2242-8577, ISSN 2242-8577.