EEF Blog: Whole-school change – how can we find out ‘what works’?

Our senior programme manager, Eleanor Stringer, discusses the challenges of evaluating whole-school improvement programmes.

On Friday we published our latest batch of independent evaluations of EEF-funded projects. These five reports included some promising results, notably a positive impact across English, maths and science from using Dialogic Teaching in primary schools.

The batch also included results from some of the earliest and most ambitious projects we funded: Challenge the Gap and Achieve Together. Both aimed to achieve whole-school change for schools in challenging circumstances:

  • Challenge the Gap, through externally-facilitated school-to-school support, led by Challenge Partners;
  • Achieve Together, through leadership development and school-improvement projects, led by a partnership of three charities: Teach First, Teaching Leaders, and Future Leaders. With funding from J.P. Morgan, we also piloted an area-based version of Achieve Together, working in disadvantaged schools in Bournemouth and involving not only the three charities but also other community partners.

Both projects had high ambitions, but also high expectations of schools. While some schools engaged well and found the projects useful, others found them resource-intensive and struggled to fit the school-improvement projects into their existing work, particularly where there wasn’t leadership buy-in.

And while some participants thought attainment outcomes would improve in the longer term, they believed it would take time for the changes to embed and feed through into pupil attainment. This might help to explain why our evaluations, which looked at results on national attainment tests after two years, found no evidence of an impact compared to a selected group of statistically similar schools (both were matched-controlled trials rather than randomised controlled trials (RCTs)).

These projects are very different to some of the most successful EEF-funded projects, which tend to be subject-focused or targeted interventions (such as ABRA Online Reading Support, or Thinking, Talking, Doing Science).

We could compare the results of these whole-school projects with those of targeted interventions, and decide to focus our efforts on more specific programmes. But is that what schools want? Both Challenge the Gap and Achieve Together were trying to address key objectives for schools: how do we improve outcomes for disadvantaged pupils across all subjects? How do we improve teaching and learning across the school?

Instead, we’ve decided to get smarter about how we fund the delivery and evaluation of whole-school programmes. Challenge the Gap and Achieve Together were good studies, but, since commissioning them, the EEF has got better at designing strong ‘implementation and process evaluation’ – the collection and analysis of data about how well projects are implemented, and what factors affect this fidelity.

We’ve also used early findings about what good implementation in schools looks like, and how to ensure high fidelity, to inform the programmes we are trialling. And we’re improving our understanding of quasi-experimental evaluation, to make sure that when RCTs are not feasible, as is sometimes the case for whole-school programmes, we’ve got the best available data on their impact. Finally, the EEF’s ‘data archive’ – which stores all data collected directly from our evaluations – means we’ll be able to track the longer-term impact of these programmes.

Of course, it may be that, even with a greater focus on project implementation and fidelity and better longer-term evaluations, we discover that such programmes simply do not have an impact on school-level outcomes. But, whatever the outcomes, the trials the EEF is funding will generate valuable information for schools and providers about ‘what works’ (or doesn’t), and why, to help develop our understanding of what is needed to bring about whole-school change.