What can we learn from implementation and process evaluations?

What can be learned from the implementation and process evaluations of COVID-affected EEF trials?
Author
Kim Bohling
Head of Evaluation

Our Head of Evaluation, Kim Bohling, explains what can be learned from the evaluations of COVID-affected EEF trials.

Blog • 3 minutes

The impacts of the pandemic on education have been far-reaching and hugely disruptive. Over the past two years, partial school closures forced teachers to completely change their approach, delivering learning remotely for most pupils. This has also had a knock-on effect on educational research and EEF-funded trials: the delivery of many programmes being tested was interrupted, which has presented real challenges for measuring their effectiveness.

Difficult decisions have had to be made about how to proceed with evaluations affected by COVID-19, in order to limit the strain on school staff while still producing useful findings for the sector.

Today we’ve published our latest batch of evaluation reports, many of which were affected by the pandemic, meaning we’ve been unable to make robust impact estimates. However, we’ve still been able to publish the implementation and process evaluations (IPEs), which contain valuable insights.

So, what can we learn from this type of evaluation?

What is an IPE?

People tend to be more familiar with impact evaluations, which estimate the effect a programme has had on its recipients’ academic progress. An implementation and process evaluation is a complementary research approach that examines how a programme was delivered and attempts to identify the elements of successful (and unsuccessful) implementation (Humphrey et al., 2016). IPEs can be designed to answer a variety of research questions, and draw upon diverse data sources including interviews, observations, and programme implementation data.

How can the findings from an IPE be used?

IPE findings can be used in a number of ways, including helping us understand how a programme is actually implemented in schools and identifying improvements that might support better implementation in future. A few examples are provided below, highlighting some of the useful findings drawn from these interrupted trials.

1. To understand how and why impact was/wasn’t achieved 

If a programme is found to be effective in improving pupil outcomes, the IPE can help us to understand how this was achieved, which is crucial for being able to replicate the programme – and ideally to produce the same positive impact again. Conversely, if a programme was not found to have an impact, an IPE may provide some evidence as to why, which can help a programme developer identify where improvements may be needed.

In the Flash Marking trial, we were not able to collect pupil outcomes to understand the programme’s impact on attainment (the primary outcome). However, the impact evaluation was also designed to look at the effect on teacher workload (a secondary outcome). Among the teachers who completed workload surveys before and after the intervention, those in schools receiving the intervention reported a greater reduction in time spent marking than those in control group schools.

The IPE findings aligned with this impact finding. Teachers who participated in interviews generally felt the programme did reduce their hours worked, particularly the time spent on marking and feedback. Teachers acknowledged that some initial time investment was needed to embed the programme, but once this was complete, providing feedback required less of their time. In schools where the programme was more securely embedded, the evaluators found that teachers were typically able to observe some reduction in workload. Where schools were using the programme alongside other tools, however, some teachers felt their workload was not reduced and may even have increased.

2. To identify where a programme is working well, and where improvements might be needed

Many IPEs include research questions that explore barriers and facilitators to programme implementation. The data collected to answer these questions can help programme developers identify elements that support delivery (e.g., strong headteacher support for a programme) and elements that hinder it (e.g., a time-consuming training programme).

In the Tips by Text trial, the programme sent regular tips via text message to parents to support developmental activities at home. The evaluators found that one key facilitator of the intervention’s implementation was that it placed little burden on schools: schools only needed to provide parent contact information during the set-up phase, and the programme team managed all other aspects of delivery. The trial also explored why some parents might not have used the tips, finding a variety of reasons, including that some parents were already doing the suggested activities and others felt the activities weren’t the right level for their child (too easy or too difficult).

This is just a brief snapshot of the findings across the evaluations. Despite the lack of an impact evaluation in some of the trials affected by COVID-19, the IPE findings have provided rich data and useful insights that can inform programme development and implementation moving forward.