Guest Blog: How our school leadership became more evidence-informed
Roger Higgins, Director of Norwich Research School – part of the EEF's Research Schools Network – explains how his school's leadership team is trying to become more evidence-informed so that it can 'walk the walk' when supporting other schools.
At Norwich Research School we use research evidence to help other schools improve their teaching practice. This doesn't mean we're the finished article. Nor does it mean that we have one-size-fits-all 'silver bullet' solutions.
Being a Research School has prompted us to develop our own leadership behaviours and become more evidence-informed. It's been challenging, but being able to lead by example has been essential when working with our own staff and with other schools.
So, what is ‘being evidence-informed’ anyway?
Profs. Coe and Kime provide some handy insights into the characteristics of evidence-informed schools. I’ve paraphrased some of them here, with a couple of additions of my own based on the work at our school on effective implementation.
An evidence-informed school:
- shifts priorities as the best available evidence changes;
- digs into the evidence beyond simple headlines such as 'metacognition works';
- develops 'theories of change' and 'best bets' as part of a careful, staged implementation process;
- prioritises evaluation to test whether changes really work in its own school context;
- triangulates professional judgement with multiple sources of data and evidence to make inferences;
- understands that effective implementation requires us to do less in order to do things better.
To be evidence-informed at every level of our decision-making, we need to ensure that all of our professional development encourages these traits in our leaders. We also need to model them in our own work as senior leaders.
Growing our leadership capacity
The platform for good school implementation is to create the right leadership environment and to carefully plan for implementation as a process, not as an event.
Reflecting on our own practice, we found that we did our best work as leaders when we worked in teams (which we termed 'implementation teams'). However, this collaboration was inconsistent, with some leaders working in isolation.
We also recognised that our leadership needed better support in designing and executing whole-school implementation initiatives; in part because we all implemented change in different ways and didn’t share a ‘common language of change’ to enable peer support.
Finally, we were potentially leading too much concurrent change in our school: our School Development Plan (SDP) contained 21 aims and was 11 pages in length.
Our first step was to train all senior leaders in implementation, including supporting them to draft an implementation plan for the forthcoming academic year. Leadership team members with implementation expertise, developed through the Research Schools Network, then provided an 'expert review' of each plan.
Finally, we overhauled our SDP so that it now summarises our senior leaders' implementation plans and communicates them clearly to our key stakeholders, primarily our staff and governors. The updated SDP is just two pages long, with seven clear aims, allowing us to focus on doing fewer things more effectively.
We have split leadership meeting time into 'operational' and 'strategic' sessions to ensure our plans are always at the forefront of our decision-making and remain responsive to change. Crucially, we regularly revisit our plans as a group in strategy meetings to monitor milestones, support and challenge each other, and continue to develop our understanding of implementation. We have also resolved to pilot any school change rather than diving straight in.
Sustaining our school leadership developments
Prof. Becky Francis, the EEF’s chief executive, recently wrote that “surface-level compliance is the biggest threat to any change in education”.
The changes we are making to our leadership habits aren’t easy; however, we feel compelled to ‘change with the evidence’.
Treating implementation as a process, not an event, and seeking to answer the question 'does it work here?' is how we believe our own school can best improve student outcomes.
With both process ('how well?') and outcome ('and so what?') evaluation measures in place for each whole-school aim this year, we have a better chance of knowing whether that is true.
Our next steps are to identify where different methodologies (e.g. 'matched groups' and 'controlled trials') can be practically employed to improve the rigour of our student assessment and answer 'did it work here?' more confidently.
If you are wondering whether to invest time in training your staff on implementation and evaluation, consider these five key questions:
- How many pages long is your SDP? Does it remain a ‘live’ document?
- How many objectives/aims are listed within your SDP? Is this realistic given your school's finite capacity for change?
- Does your senior team work in ‘packs’ or as ‘islands’?
- How coherent does whole-school change look to your teachers and support staff?
- How confidently can you state the impact on student outcomes of changes you have made?