EEF Research Papers

The EEF Research Paper Series is a collection of open access research papers relevant to EEF evaluations and the wider research community.

The papers are not externally peer-reviewed. They are published electronically and are freely available online or through email distribution.

For each paper, we publish an 'EEF response', briefly setting out how we are using these findings to inform our future work.



Teachers' engagement with research: what do we know? A research briefing

Authors: Matt Walker, Julie Nelson, Sally Bradshaw, Chris Brown 

Research Brief, May 2019

This research briefing summarises findings from a nationally representative survey of schools and teachers, which investigated teachers’ research use. The survey was designed with reference to the principles adopted in an earlier 2014 study in which the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF) developed a Research Use Survey (RUS) (Nelson et al., 2017). The new survey took the most effective elements of the RUS, and augmented these with recent knowledge about research engagement and use. The findings are based on survey results from 1,670 teachers in England. The survey was administered between 19 September and 12 November 2017. 


Properties of commercial tests in the EEF

Authors: Rebecca Allen, John Jerrim, Meenakshi Parameshwaran, Dave Thompson

EEF Research Paper Series, No. 001, February 2018

EEF response: This study analyses some of the properties of major commercial assessments used by the EEF in large trials. We have used the results of this study to update some of the assumptions fed into sample size calculations and inform the choice of appropriate and cost-effective commercial assessments to be used as pre-tests. 
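
As a rough sketch of how such assumptions feed into the calculations (a generic illustration under standard power-analysis assumptions, not the paper's own model; the correlation and design values are placeholders), the assumed correlation between a commercial pre-test and the trial outcome reduces the number of pupils needed to detect a given effect:

    from math import ceil
    from scipy.stats import norm

    def n_per_arm(mdes, pretest_corr, alpha=0.05, power=0.8):
        # Approximate pupils per arm in a simple two-arm trial; a pre-test
        # covariate with correlation r to the outcome reduces residual
        # variance by a factor of (1 - r**2).
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return ceil(2 * (1 - pretest_corr ** 2) * (z / mdes) ** 2)

    # Placeholder values: a stronger pre-test correlation shrinks the required sample.
    for r in (0.0, 0.5, 0.7):
        print(r, n_per_arm(mdes=0.2, pretest_corr=r))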


Standard Deviation as an outcome on interventions: a methodological investigation

Authors: Peter Tymms, Adetayo Kasim

EEF Research Paper Series, No. 002, February 2018

EEF response: In line with its goal of narrowing the attainment gap in England, the EEF is considering methods to monitor the differential effects that interventions might have on pupils of varying levels of ability. A change in the dispersion of outcomes in the intervention group would indicate differential effects. Based on the results of this study, the EEF updated its Analysis Guidance to ask evaluators to report the standard deviations by trial arm before and after the intervention.
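
As a minimal illustration of the reporting this implies (the data below are hypothetical, and this is not the paper's own analysis), the standard deviation of the outcome can be compared by trial arm at pre- and post-test:

    from statistics import stdev

    # Hypothetical pupil-level scores by trial arm, before and after the intervention.
    scores = {
        "intervention": {"pre": [22, 31, 27, 40, 25], "post": [30, 36, 33, 47, 29]},
        "control":      {"pre": [24, 33, 28, 39, 26], "post": [28, 37, 32, 43, 30]},
    }

    # Report the standard deviation by trial arm at pre- and post-test; a marked
    # shift in dispersion in the intervention arm, relative to control, would hint
    # at differential effects across the ability range.
    for arm, tests in scores.items():
        sd_pre, sd_post = stdev(tests["pre"]), stdev(tests["post"])
        print(arm, round(sd_pre, 2), round(sd_post, 2), round(sd_post / sd_pre, 2))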


Does the classroom level matter in the design of educational trials? A theoretical & empirical review.

Author: Sean Demack

EEF Research Paper Series, No. 003, May 2019

EEF response: This study examines some of the theoretical and empirical implications of accounting for how pupils are naturally organised into ‘classes’, which are in turn clustered within schools. The EEF recognises the contribution made by this study and invites evaluators to consider accounting for ‘class’-level clustering in their proposed designs. The paper suggests the cases in which this may be most relevant. Because only a limited number of studies were included in the review, additional evidence is needed.
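
To illustrate why the ‘class’ level can matter, the sketch below applies the standard three-level design-effect formula (pupils within classes within schools) with placeholder intraclass correlations; these are not figures from the paper.

    def design_effect(pupils_per_class, classes_per_school, icc_class, icc_school):
        # Standard variance-inflation formula for pupils nested in classes
        # nested in schools (a generic formula, not taken from the paper).
        pupils_per_school = pupils_per_class * classes_per_school
        return (1
                + (pupils_per_class - 1) * icc_class
                + (pupils_per_school - 1) * icc_school)

    # Placeholder ICCs: ignoring the class level understates the design effect,
    # and so overstates the effective sample size of a trial.
    with_class = design_effect(25, 2, icc_class=0.10, icc_school=0.15)
    no_class   = design_effect(25, 2, icc_class=0.00, icc_school=0.15)
    print(with_class, no_class, round(5000 / with_class), round(5000 / no_class))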


GCSE science as an outcome measure: the capacity of the Deeper Thinking intervention to improve GCSE science grades

Authors: Ben Smith, Andrew Boyle, Stephen Morris

EEF Research Paper Series, No. 004, February 2020

EEF response: National test results, such as GCSE grades, are an outcome measure for numerous EEF evaluations. However, any intervention aiming to improve grades must operate by improving students’ marks. This paper examines the impact an intervention could have on students’ marks in their GCSE Science exams, and the implications for a trial’s minimum detectable effect size (MDES).

As a first step, the authors scrutinised GCSE papers to determine how many marks could plausibly be gained as a result of the intervention. Mark distributions were then simulated to assess what mark gain was likely in practice. Finally, the implications for MDES and sample size calculations were examined.

This work was conducted as part of the Deeper Thinking project and can inform MDES and sample size calculations for future EEF evaluations by taking plausible mark gains into account.
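
The underlying arithmetic can be sketched generically (placeholder numbers, not the paper's results): a plausible mark gain is converted into a standardised effect size and compared with the MDES that a given sample size could detect.

    from math import sqrt
    from scipy.stats import norm

    def mdes(n_per_arm, alpha=0.05, power=0.8):
        # Approximate MDES for a simple two-arm, individually randomised trial.
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return z * sqrt(2 / n_per_arm)

    # Placeholder figures: a plausible gain of 3 marks on a paper whose marks have
    # a standard deviation of 20 corresponds to an effect size of 0.15, which a
    # trial can only detect if its MDES is at least that small.
    effect_size = 3 / 20
    print(effect_size, round(mdes(400), 3), round(mdes(800), 3))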


Feedback: Practice Review

Authors: Victoria Elliott, Ashmita Randhawa, Jenni Ingram, Lesley Nelson-Addy, Charles Griffin, Jo-Anne Baird (Department of Education, University of Oxford)

This is a mixed-methods review of practice that triangulates documentary analysis of feedback policies, a medium-scale online survey of primary and secondary teachers, and in-depth interviews with teachers at eight case study institutions. By doing this, we aimed to gain a picture of what is happening in schools, how that relates to policy, what staff think about it, and the lessons that may be gained for practice elsewhere.


Developing a behavioural approach to knowledge mobilisation: Reflections for the What Works Network

Authors: Stephanie Waddell, Prof Jonathan Sharples

In this What Works Network Strategic Fund project, we aimed to develop and pilot an approach to mobilising research evidence that was informed by the behavioural needs of users. It was based on the premise that by understanding the current state of practice, in addition to the current state of the evidence base, What Works centres could better address the gaps between the two. It focused on mobilising a joint piece of evidence-based guidance from the Early Intervention Foundation (EIF) and the Education Endowment Foundation (EEF) on social and emotional learning (SEL).


Measuring Teachers' Research Engagement: Findings from a pilot study

Authors: Julie Nelson, Palak Mehta, Jonathan Sharples, Calum Davey

Report and Executive Summary, March 2017

Despite recent policies to support evidence-informed teaching and a number of important practical developments, we still don’t know a great deal about the current extent or depth of evidence-informed practice across schools in England. This paper presents findings from a survey co-developed by the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF), which captured information about this issue in late 2014.

The survey was developed to provide a measure of research engagement across a series of EEF-funded projects that aim to increase schools’ awareness and use of research evidence. The survey has also informed the EEF’s overall approach to scaling up and mobilising evidence, a key priority for the organisation in the second five years of its life.

It suggests that, at that point, academic research was having only a small to moderate influence on decision-making relative to other sources, despite teachers generally reporting a positive disposition towards research. It also suggests that this positive disposition towards research, and teachers’ perceptions of their research engagement, were not necessarily translating into an increased conceptual understanding of research knowledge.