EEF Research Papers
The EEF Research Paper Series is a collection of open access research papers relevant to EEF evaluations and the wider research community.
The papers are not externally peer-reviewed. They are published electronically and are freely available online or through email distribution.
For each paper, we publish an 'EEF response', briefly setting out how we are using these findings to inform our future work.
Authors: Ben Smith, Stephen Morris and Harry Armitage
This guidance is intended for the planning stage of trials – in particular, trials whose primary outcome measure is attainment at GCSE (especially GCSE English Language and Mathematics).
Authors: Ben Smith, Stephen Morris and Harry Armitage
This paper aims to assess the impact of using GCSE grades as a primary outcome in educational evaluations and trials, compared to using marks.
Authors: Sam Sims, Jake Anders, and Laura Zieger
This report focuses on whether one particular non-experimental method can reproduce the results from experimental evaluations: the comparative interrupted time series (CITS) design. The basic idea is to compare the way in which outcomes in the treatment group deviate from trend after an intervention is introduced, relative to the way in which outcomes in the control group deviate from trend at the same point in time. Under certain assumptions, the difference between these deviations can be interpreted as the effect of the intervention.
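The CITS logic described above – fit each group's pre-intervention trend, project it forward, and take the difference between the two groups' post-intervention deviations – can be sketched in a few lines. This is a minimal illustration of the general idea, not the estimation procedure used in the report; the function name and the made-up annual outcome series are hypothetical.

```python
import numpy as np

def cits_effect(pre_t, post_t, pre_c, post_c):
    """Comparative interrupted time series sketch: the intervention effect
    as the treatment group's post-intervention deviation from its own
    pre-intervention linear trend, minus the same deviation for controls.

    pre_*: equally spaced outcomes before the intervention
    post_*: outcomes after the intervention
    """
    def deviation(pre, post):
        # Fit a linear trend to the pre-intervention period...
        t_pre = np.arange(len(pre))
        slope, intercept = np.polyfit(t_pre, pre, 1)
        # ...project it into the post-period and average the deviations.
        t_post = np.arange(len(pre), len(pre) + len(post))
        return np.mean(post - (intercept + slope * t_post))

    return deviation(pre_t, post_t) - deviation(pre_c, post_c)

# Illustrative data: both groups trend upward by 1 point a year; after the
# intervention the treatment group jumps 5 points above its trend.
effect = cits_effect(np.array([10., 11, 12, 13]), np.array([19., 20]),
                     np.array([10., 11, 12, 13]), np.array([14., 15]))
```

Under the assumptions the report discusses (notably, that both groups would have continued on parallel deviations absent the intervention), `effect` recovers the 5-point impact.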
Authors: Matt Walker, Julie Nelson, Sally Bradshaw, Chris Brown
Research Brief, May 2019
This research briefing summarises findings from a nationally representative survey of schools and teachers, which investigated teachers’ research use. The survey was designed with reference to the principles adopted in an earlier 2014 study in which the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF) developed a Research Use Survey (RUS) (Nelson et al., 2017). The new survey took the most effective elements of the RUS, and augmented these with recent knowledge about research engagement and use. The findings are based on survey results from 1,670 teachers in England. The survey was administered between 19 September and 12 November 2017.
Authors: Rebecca Allen, John Jerrim, Meenakshi Parameshwaran, Dave Thompson
EEF Research Paper Series, No. 001, February 2018
EEF response: This study analyses some of the properties of major commercial assessments used by the EEF in large trials. We have used the results of this study to update some of the assumptions fed into sample size calculations and inform the choice of appropriate and cost-effective commercial assessments to be used as pre-tests.
Authors: Peter Tymms, Adetayo Kasim
EEF Research Paper Series, No. 002, February 2018
EEF response: In line with its goal of narrowing the attainment gap in England, the EEF is considering methods to monitor the differential effects that interventions might have on pupils of varying levels of ability. A change in the dispersion of outcomes in the intervention group would be indicative of such differential effects. Based on the results of this study, the EEF updated its Analysis Guidance to ask evaluators to report the standard deviations by trial arm before and after the intervention.
Author: Sean Demack
EEF Research Paper Series, No. 003, May 2019
EEF response: This study examines the theoretical and empirical implications of accounting for how pupils are naturally organised into 'classes', which are in turn clustered within schools. The EEF recognises the contribution made by this study and invites evaluators to consider accounting for 'class'-level clustering in their proposed designs. The paper suggests the cases in which this is likely to be most relevant. Because only a limited number of studies were included in the review, additional evidence is needed.
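The practical consequence of ignoring a clustering level is an understated design effect, and hence an overstated effective sample size. A common textbook formula for a balanced three-level design (pupils in classes in schools) captures this; the sketch below uses that standard formula with purely illustrative intraclass correlation values, and is not drawn from the paper itself.

```python
def design_effect(pupils_per_class, classes_per_school, icc_school, icc_class):
    """Variance inflation factor for a balanced three-level design:
    pupils nested in classes, classes nested in schools.

    icc_school, icc_class: proportion of total outcome variance at the
    school level and the class level respectively.
    """
    return (1
            + (pupils_per_class - 1) * (icc_school + icc_class)
            + pupils_per_class * (classes_per_school - 1) * icc_school)

# Hypothetical values: 25 pupils per class, 2 classes per school,
# 15% of variance between schools, 10% between classes.
deff = design_effect(25, 2, 0.15, 0.10)   # 10.75
effective_n = 2000 / deff                  # ~186 pupils' worth of information
```

Setting `icc_class` to zero reproduces the two-level (school-only) calculation, so the gap between the two results shows how much precision a design loses, unaccounted for, when class clustering is real but ignored.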
Authors: Ben Smith, Andrew Boyle, Stephen Morris
EEF Research Paper Series, No. 004, February 2020
EEF response: National test results, such as GCSE grades, are used as an outcome measure in numerous EEF evaluations. However, any intervention aiming to improve grades must operate by improving students' marks. This paper examines the impact an intervention could have in terms of improving students' marks in their GCSE Science exams, and the implications for a trial's minimum detectable effect size (MDES).
As a first step, the authors scrutinised GCSE papers to determine how many marks could plausibly be gained as a result of the intervention. Mark distributions were then simulated to assess what mark gain was likely in practice. Finally, the implications for MDES and sample size calculations were examined.
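The final step – translating a plausible effect into required sample size – rests on a standard MDES calculation. The sketch below is a simplified version for a two-arm, individually randomised trial, using the large-sample normal approximation and ignoring clustering; the function name and example numbers are illustrative, not taken from the paper.

```python
from statistics import NormalDist

def mdes(n_total, p_treat=0.5, r_squared=0.0, alpha=0.05, power=0.8):
    """Minimum detectable effect size (in standard deviation units) for a
    two-arm, individually randomised trial, normal approximation.

    p_treat:   proportion of the sample allocated to treatment
    r_squared: outcome variance explained by covariates (e.g. a pre-test)
    """
    z = NormalDist().inv_cdf
    multiplier = z(1 - alpha / 2) + z(power)  # ~2.80 for alpha=.05, power=.80
    return multiplier * ((1 - r_squared)
                         / (p_treat * (1 - p_treat) * n_total)) ** 0.5

# 400 pupils split evenly: MDES of roughly 0.28 SD; a predictive pre-test
# (r_squared = 0.5) shrinks the detectable effect for the same sample.
baseline = mdes(400)
with_pretest = mdes(400, r_squared=0.5)
```

Working the calculation in the mark direction, as the paper does, means asking whether the plausible mark gain, expressed in standard deviation units, exceeds the MDES a realistic sample can deliver.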
This work was conducted as part of the Deeper Thinking project and can inform the MDES and sample size calculations of future EEF evaluations by considering impact in terms of marks.
Authors: Victoria Elliott, Ashmita Randhawa, Jenni Ingram, Lesley Nelson-Addy, Charles Griffin, Jo-Anne Baird (Department of Education, University of Oxford)
This is a mixed-methods review of practice that triangulates documentary analysis of feedback policies, a medium-scale online survey of primary and secondary teachers, and in-depth interviews with teachers at eight case-study institutions. In doing so, we aimed to build a picture of what is happening in schools, how this relates to policy, what staff think about it, and the lessons that may be drawn for practice elsewhere.
Authors: Stephanie Waddell, Prof Jonathan Sharples
In this What Works Network Strategic Fund project, we aimed to develop and pilot an approach to mobilising research evidence that was informed by the behavioural needs of users. It was based on the premise that by understanding the current state of practice, in addition to the current state of the evidence base, What Works centres could better address the gaps between the two. It focused on mobilising a joint piece of evidence-based guidance from the Early Intervention Foundation (EIF) and the Education Endowment Foundation (EEF) on social and emotional learning (SEL).
Authors: Julie Nelson, Palak Mehta, Jonathan Sharples, Calum Davey
Report and Executive Summary, March 2017
Despite recent policies to support evidence-informed teaching, and a number of important practical developments, we still do not know a great deal about the current extent or depth of evidence-informed practice across schools in England. This paper presents findings from a survey co-developed by the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF), which captured information about this issue in late 2014.
The survey was developed to provide a measure of research engagement across a series of projects, funded by the EEF, which aim to increase schools’ awareness, and use, of research evidence. The survey has also informed the EEF’s overall approach to scaling-up and mobilising evidence – a key priority for the organisation in the second five years of its life.
It suggests that, at this point, academic research was having only a small to moderate influence on decision-making relative to other sources, despite teachers generally reporting a positive disposition towards research. It also suggests that this positive disposition, and teachers' perceptions of their research engagement, were not necessarily translating into a deeper conceptual understanding of research knowledge.
Authors: Bilal Ashraf, Akansha Singh, Germaine Uwimpuhwe, Tahani Coolen-Maturi, Jochen Einbeck, Steve Higgins and Adetayo Kasim
EEF response: This study investigates the impact of EEF-funded trials on pupils eligible for free school meals. Although similar analysis is conducted during each individual evaluation, this report conducts a meta-analysis using data from 88 trials and over half a million pupils to reach conclusions. The report contributes to the evidence about what type of approaches may be effective at reducing the attainment gap, which influences the choice of interventions that EEF chooses to trial and scale up.