How to be aware of cognitive bias in your school decision-making processes

Dr Fiona Aubrey-Smith, director of One Life Learning, explains how your cognitive bias could be affecting your decision-making 

We are seeing a wonderful shift as education becomes an increasingly evidence-informed profession. Thanks to research and resources such as the Education Endowment Foundation’s Teaching and Learning Toolkit and Professor John Hattie’s Visible Learning Meta X, we are more informed than ever before about what works and what does not.

However, how often have you spent hours searching for solutions to problems that your school faces – and countless meetings and days working on something – only to find it has not led to the impact that you had hoped for? Despite access to robust and comprehensive evidence sources, we are still seeing inefficiencies in our decision-making. So why is this? It is all about cognitive bias. In other words, what is going on inside our own heads affects how we look at the world around us.

All of us look at the world through a specific lens which changes depending on the context we are in and who else is involved. As an example, let’s take something called ‘anecdotal fallacy’ – which is the tendency to take anecdotal information (e.g., other schools recommending a strategy or resource) at face value and give it the same status as more rigorous data when making judgements about effectiveness. We have all experienced this – hearing how successful something was across our cluster, federation, trust or local authority. We do not want to be the one not adopting something that clearly sounds like it works, so we buy in and start doing it in our own school.

However, this misses the important step of asking why it worked elsewhere. Could it have become a runaway train, with other schools adopting it because they did not want to be seen as the one school where it did not work? Are the schools where the strategy worked similar enough to our own school to make a justifiable comparison about its likely impact? Most importantly, what does the evidence – impartial, objective, robust, longitudinal evidence – say?

Cognitive bias is an emerging field within education and an absolutely vital one to understand if we are to have greater impact – both as individual professionals and through our collective efficacy, working together to have a greater positive impact on children’s learning.

Get started

So here is an easy (and hopefully entertaining!) way to get started. Sit with your colleagues and talk through the types of bias in the list below. These are adapted from Hattie and Hamilton (2019), As good as gold? Why we focus on the wrong drivers in education, and are provided here with practical illustrations and examples to help you see which of these might be featuring in decision-making at your school. Be honest with each other and share examples where you think this has happened within your own school – perhaps collectively, perhaps individually, perhaps with specific colleagues.

  • Authority bias – the tendency to attribute greater weight and accuracy to the opinions of an authority or well-known figure – irrespective of whether this is deserved – and to be influenced by it. For example, how many conference presenters or Twitter folk have you requoted without checking the accuracy of their claims? In order to tackle this, try not to be swayed by famous, titled gurus. Carefully unpack and test all of their assumptions – especially if they are making claims outside their specific area of expertise.
  • Confirmation bias – the tendency to collect and interpret information in a way that conforms with, rather than opposes, our existing beliefs. For example, how much of your monitoring confirms your existing predictions rather than viewing them objectively and critically? In order to tackle this, be prepared to go against the grain, and to question sacred assumptions. Remember, we tend to select education approaches, products and services that accord with our world view, and we will often continue to believe in them even when convincing evidence is presented that our world view may be distorted.
  • Ostrich effect – the tendency to avoid monitoring information that might give psychological discomfort. What is currently on your radar that you have been avoiding having to address? In order to tackle this, collect robust and regular data from a range of sources about the implementation of new interventions and analyse this data ruthlessly – involve colleagues who have contrasting opinions to your own.
  • Anecdotal fallacy – the tendency to take anecdotal information – such as other schools recommending a strategy or resource – at face value and give it the same status as more rigorous data when making judgements about effectiveness. In order to tackle this, probe deeply into why strategies, resources or products worked – where is the evidence, and is that evidence robust, impartial and objective?
  • Halo effect – the tendency to generalise from limited experiences about an individual person, company or product – assuming everything they do or offer is just as good. In order to tackle this, remember that everyone has strengths and weaknesses. An expert in one area will not be an expert in everything. For each area, look critically at what the evidence shows has the greatest impact.
  • But our school is different – the tendency to avoid using a tried and tested solution which evidence shows works because it was used or created elsewhere, claiming ‘but we are different here…’ In order to tackle this, remember that we have more in common than that which divides us. Do not reinvent the wheel – adapt or adopt what the evidence shows works so that it works for your school too.
  • The ‘IKEA effect’ – the tendency to have greater buy-in to a solution where the end-user is directly involved in building or localising the strategy, product or service. In order to tackle this, remember that we all feel greater ownership when we are involved in the creation or adaptation of something; channel this energy into personalising solutions that the evidence shows work.
  • Jumping on the bandwagon – the tendency to believe that something is good because a large number of other people believe it is good. In order to tackle this, simply remember that it might work or it might not. Ask those on the bandwagon to point you to robust evidence, and always check: what does the evidence actually say?
  • Cherry-picking – the tendency to remember or overemphasise pockets of positive or negative data within larger sets of more random data (i.e. seeing phantom patterns). Why are you subconsciously doing that? To tackle this, look at trends over time, or trends across groups of children or schools. Do you have access to the whole dataset, and does the rest of the data broadly agree with or support these trends?
  • Courtesy bias – the tendency to give an opinion that is more socially palatable than our true beliefs (are you contributing to an echo chamber in your own school?). In order to tackle this, remember that human beings are never objective; we all look at things differently. Embrace this truth and use it as a way to open discussion, debate and robust conversation among all those involved about what the evidence really says.
  • Law of the instrument – the tendency to only address problems for which you already have a potential solution. (Remember – if you have a hammer, everything looks like a nail.) In order to tackle this, identify the problems that need to be solved, then search for solutions, rather than searching for problems to which you already have a solution.
  • Easy task blinkers – the tendency to avoid complex projects, focusing instead on projects that are simple and easy for most people to grasp. In order to tackle this, divert the time and energy spent on all those little projects and focus it on fewer, bigger, more impactful projects. Go for quality strategies, not quantity of actions.
  • Sunk cost fallacy – the tendency to continue with a project that is not bearing fruit, simply because so much has been invested in it already. In order to tackle this, for any investment – of time or money – ensure early milestones offer genuine review points that include the option to stop. Even if money has been committed, future time has not been – and this can be redirected somewhere else more impactful.

Once you have had a chance to do this, reflect on the suggestions about how you might reduce or avoid such bias in the future. It’s important to be really honest with yourself, however uncomfortable this might be at first.

Only by recognising these biases, and addressing them, will we break free of the limitations they place on our professional practice – and on the impact we could have on our children’s learning.
