EEF Blog: Today’s findings; impact, no-impact and inconclusive – a normal distribution of findings

Author
EEF

James Richardson, Senior Analyst at the EEF, discusses findings from the latest completed EEF projects.

Each new batch of published EEF evaluations brings important new messages about the impact of different approaches to learning. Today's nine reports are no exception. The positive result for Accelerated Reader – a low-cost computer programme that encourages children to read independently through rewards and quizzes, matching their reading ability to appropriate texts – suggests that more expensive isn't always better. Children following the approach made three additional months' progress compared with their peers who did not.

Another finding of note is the small positive impact of teaching children fewer mathematical concepts but covering them in greater depth to ensure 'mastery'. The EEF's evaluation of Mathematics Mastery will make fascinating reading for headteachers contemplating introducing this approach into their school. Of course, the true value of this method may only become evident in years to come, as children are able to draw on their secure mathematical foundations to tackle more complex problems.

Naturally, each set of published reports also contains results that find 'no impact'. This is the nature of research, and we should not hide from the challenge of narrowing the gap. We are also learning more about conducting randomised trials in schools, and not every study shows a clear result. This is frustrating, but each study result is valuable in helping us understand how research can inform our approach to teaching and learning.

We are also learning more about the methods of evaluation and the prospect of schools running their own trials. The Accelerated Reader report, along with that for Fresh Start, part of the Read, Write, Inc suite of interventions, was the product of 14 schools, each running the interventions independently of the others, with the results aggregated by the evaluation team at Durham University to form two robust impact evaluations. The lead evaluator, Professor Stephen Gorard, concludes that the success of these projects demonstrates that schools can run trials of their own, provided they are aided by an expert who can oversee the randomisation and advise on the analysis. This has exciting implications, especially for those who have used our DIY Evaluation Guide to set up a trial in their own school and may consider collaborating with other schools to run larger trials.
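For readers curious about the mechanics of aggregating many small, independently run school trials, here is a minimal sketch, assuming entirely invented data and a simple fixed-effect, inverse-variance pooling of standardised mean differences. It is an illustration only, not the Durham team's actual analysis; every figure, school and helper function in it is hypothetical.

```python
# Illustrative sketch only: combining effect sizes from several independently
# run school-level trials into one pooled estimate. All numbers are invented.

from math import sqrt

# Each entry: pupils per arm, plus mean and SD of the outcome measure
# for the intervention and control groups in one school's trial.
school_results = [
    {"n_treat": 30, "mean_treat": 52.0, "sd_treat": 10.0,
     "n_ctrl": 30, "mean_ctrl": 48.0, "sd_ctrl": 11.0},
    {"n_treat": 25, "mean_treat": 47.5, "sd_treat": 9.5,
     "n_ctrl": 26, "mean_ctrl": 45.0, "sd_ctrl": 10.5},
    {"n_treat": 28, "mean_treat": 50.0, "sd_treat": 12.0,
     "n_ctrl": 27, "mean_ctrl": 47.0, "sd_ctrl": 11.5},
]

def cohens_d(r):
    """Standardised mean difference (Cohen's d) for one school's trial."""
    pooled_var = (((r["n_treat"] - 1) * r["sd_treat"] ** 2 +
                   (r["n_ctrl"] - 1) * r["sd_ctrl"] ** 2) /
                  (r["n_treat"] + r["n_ctrl"] - 2))
    return (r["mean_treat"] - r["mean_ctrl"]) / sqrt(pooled_var)

def d_variance(r, d):
    """Approximate sampling variance of d (standard large-sample formula)."""
    n_t, n_c = r["n_treat"], r["n_ctrl"]
    return (n_t + n_c) / (n_t * n_c) + d ** 2 / (2 * (n_t + n_c))

# Fixed-effect pooling: weight each school's effect size by the inverse
# of its sampling variance, so more precise trials count for more.
weights, weighted_ds = [], []
for r in school_results:
    d = cohens_d(r)
    w = 1.0 / d_variance(r, d)
    weights.append(w)
    weighted_ds.append(w * d)

pooled_d = sum(weighted_ds) / sum(weights)
print(f"Pooled effect size across schools: d = {pooled_d:.2f}")
```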

Important though these independent evaluations are, especially for popular commercial interventions, we recognise that each evaluation report has greater worth when combined with other similarly robust research.

Our commitment as an organisation is not only to build the strength of the evidence base in education, across key stages, topics, approaches and techniques, but also to ensure that the key messages emerging from the research are synthesised and communicated clearly to teachers and school leaders, so that evidence can form a central pillar of how decisions are made in schools.

We have already begun this work, driven by the messages from our published trials as well as the existing evidence base. How teaching assistants can be used to best effect, important lessons in literacy at the transition from primary to secondary, and the principles that should underpin approaches to encouraging children to read for pleasure are all issues with important implications for school leaders. Synthesising and disseminating these vital messages will form the backbone of a new phase of EEF work beginning later in the year.