
EEF Blog: The EEF’s approach to COVID-19 impacted evaluations

Author
Ellen Smith
Programme Support Officer

Our programme support officer, Ellen Smith, explains the impact of partial school closures on EEF-funded trials and the principles that have guided decisions made in response.

Blog • 3 minutes

At the core of the EEF is our mission to raise attainment for disadvantaged students. To find out what works in closing the gap, we work closely with EEF grantees and independent evaluators to ensure that evaluations of interventions are reliable and rigorous.

With a portfolio of 76 active projects across England, covering themes such as the early years, social and emotional learning, feedback, science, and digital technology, this requires tight-knit planning and communication across all parties involved.

It has never been more important to learn from what has or hasn’t worked in other classrooms…

Unfortunately, COVID-19 and its subsequent impact on schools, early years settings, and further education providers have had a knock-on effect on a large number of our evaluations. The several waves of partial school closures that we’ve seen, accompanied by the immense challenges that schools have faced in the last year, have posed sizable problems for many of our projects and, in some cases, have forced their discontinuation or curtailment.

Across these projects, we have attempted to maintain a consistent approach to decisions. For the benefit of other research funders, and all those interested in our evaluations, we share here the five principles that have guided our decision-making on what to do with each project: whether to continue as planned despite the challenges, modify and adapt, or take the unfortunate decision to discontinue.

1. Do no harm to schools

Whilst evidence generated from trials is, of course, beneficial for schools – in helping them to understand which interventions are effective, and ultimately to help them embed evidence-led practice – we recognise that participating in evaluations may be very challenging at a time when schools are facing so many difficulties. We have, therefore, endeavoured to minimise any further burden to schools by limiting additional data collection or trial-related communications.

2. Generalisability of results

When deciding whether to proceed with an evaluation in the same or an adapted form, we must also think about how reliable the results will be in assessing the efficacy or effectiveness of an intervention. We have carefully considered how we would interpret the results of any COVID-impacted evaluation, and whether those results would remain informative to schools given that delivery has taken place in such an unusual context. Results may not be generalisable to a post-pandemic era, for example if an intervention was delivered in a way it wouldn’t ordinarily be, and we have weighed this across projects.

3. Feasibility

We have, of course, also had to balance whether it is even possible to continue delivering or evaluating a project given the COVID restrictions placed on us all in the last year. These may make certain evaluation tasks, such as in-school testing, extremely difficult to undertake, while some interventions may be very challenging to deliver in a COVID context. The cancellation of some national assessments has also made assessing the impact of some of our projects very difficult. Unfortunately, in some cases, continuing with safe delivery or evaluation has therefore proven unfeasible.

4. Fulfil promises to schools, parents and pupils

Where schools, parents and pupils have been promised an intervention, and where it is still feasible to deliver and the desire from schools remains, we have endeavoured to continue delivery as best we can. Everyone in education has faced huge challenges in the past year and, where possible, the EEF is determined to provide the interventions offered where schools and parents wish to continue, even if the initially planned evaluation is no longer feasible.

5. Case-by-case

First and foremost, ours has not been a ‘one size fits all’ approach. With trials at different stages and involving different year groups and education settings, the EEF has convened meetings with each delivery and evaluation team involved in a COVID-impacted evaluation to consider the best way forward on a case-by-case basis.

After consideration of these principles, many projects have been discontinued or modified. For instance, some evaluations will now focus more on the qualitative elements of the evaluation rather than the quantitative estimates of impact we had hoped to capture. In others, we have repurposed some of the research questions to learn what we can in the current context.

However, in all cases, the EEF will do what we can to learn lessons from each project and will publish available findings. With research indicating a further widening of the attainment gap for disadvantaged students over the pandemic, it has never been more important to learn from what has or hasn’t worked in other classrooms, and to make evidence-informed decisions in response to large challenges.

For information about specific evaluations, please visit the project’s EEF webpage where you will find a summary of the impact of COVID on the evaluation. If you are taking part in an EEF evaluation, the relevant delivery team or evaluator will be in touch to inform you of any changes.