EEF Blog:  The “London effect” and the need for evaluation

Eleanor Stringer, Grants Manager at the EEF, considers the implications of a new IFS report.

A fascinating new report by the IFS* for the Social Mobility and Child Poverty Commission tries to unravel the “London effect”. Why is it that disadvantaged pupils in London perform so well academically? For example, nearly 50% of pupils on free school meals gained 5 or more GCSEs at grades A*-C in 2013, compared with just 30% of similar pupils in the rest of the south of England.

The unsurprising answer is that several factors seem to come into play, including differences in demographics. But when most variables are controlled for, it looks like improvements in attainment in primary school between 1999 and 2003 were a huge factor in increasing attainment in secondary schools.

There is lots of discussion in the blogosphere about the implications of this finding. Does it mean that all the plaudits given to programmes such as Teach First and London Challenge, both of which came later and were targeted at secondary schools, are undeserved? Does this vindicate the much-maligned National Strategies, programmes designed to improve literacy and numeracy in the late 1990s?

The honest answer is “We don’t know”. The IFS have done a great job of trying to disentangle the impact of different variables. But without the results of rigorous evaluations of the different interventions, we’ll always be making educated guesses.

This is why at the EEF we focus on using the most robust evaluation methodologies possible when setting up our grants. Wherever possible, this means randomising schools or pupils, so that you have two similar groups of participants: one receiving the new intervention, the other acting as a control group. Sometimes this isn’t possible. For example, the study we’re undertaking of the intervention that is most similar to London Challenge – Challenge the Gap – is at such a scale and complexity that randomisation wasn’t feasible; instead, similar schools will be identified statistically to form a “matched” control group.
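For readers curious about what randomisation looks like in practice, here is a minimal illustrative sketch of randomly allocating schools to treatment and control groups. The school names and the fixed seed are purely hypothetical, and real trials would use more sophisticated allocation (e.g. stratification); this just shows the basic idea.

```python
import random

def randomise(schools, seed=42):
    """Randomly split a list of schools into treatment and control groups.

    A fixed seed makes the allocation reproducible and auditable.
    """
    rng = random.Random(seed)
    shuffled = schools[:]          # copy so the original list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]  # (treatment, control)

# Hypothetical example with six anonymised schools
schools = ["School A", "School B", "School C",
           "School D", "School E", "School F"]
treatment, control = randomise(schools)
```

Because allocation is random, the two groups should, on average, be similar in both observed and unobserved characteristics, which is what lets a trial attribute any difference in outcomes to the intervention itself.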

The report’s findings also chime with our experience of trying to improve the reading skills of 11-year-olds, as outlined in our Interim Evidence Brief on Friday. Helping poor readers at the end of primary school, or the beginning of secondary school, is hard work. Intervening earlier to stop them falling behind in the first place seems to be a much better strategy.

We hope that in years to come, as our work grows and as interventions which are shown to be effective are taken up at scale, the attainment of poorer pupils will continue to rise. But when this happens, we’ll have a much stronger idea about which elements made a difference, due to the rigour and breadth of research about the different programmes.

*Disclaimer: The IFS is one of the EEF’s independent evaluators, and is currently undertaking five evaluations of EEF-funded projects.