
EEF Blog: Fidelity vs. flexibility

Author: James Richardson

Blog • 3 minutes

James Richardson looks at what today’s evaluation reports tell us about how best to implement research in the classroom. 

Today we publish EEF-funded independent evaluations of three pilot programmes: Research Champions, Research into Practice and Powerful Learning Conversations. 

Pilot programmes are designed to test the promise of an idea and the feasibility of trialling it on a larger scale. Their reports carry no effect sizes or security padlocks, so we don’t expect to draw robust conclusions from them. But this can make their findings and messages difficult to communicate.

There may be no headline-grabbing results in today’s batch of evaluation reports, but look a little closer and you’ll find little gems of information about teachers’ changing practice and successful CPD.

Take Research into Practice and Research Champion. We know from previous pilots, notably of the Anglican Partnership’s Effective Feedback, that navigating the gulf between research and practice can be a considerable challenge.

Both programmes were funded as part of our £1.5m drive with the London Schools Excellence Fund and the Department for Education to improve the effectiveness of research use in schools. Both aimed to do this by assigning a designated research lead to each participating school.

In Research Champion, a senior teacher based at one of the five participating schools became their ‘research champion’ and worked to promote staff engagement with research. But even with dedicated symposia to discuss and unpick findings, the evaluators from NatCen Social Research found the impact on classroom practice and teachers’ attitudes towards research to be minimal.

Research into Practice, however, showed more promise. Teachers received half-termly training sessions with a ‘Research into Practice lead’ that focused on a particular area of research, and worked collaboratively to apply specific interventions to real classroom issues. The evaluation reported a statistically significant increase in teachers’ positive attitudes towards academic research at the end of the programme, as well as an increase in the proportions of teachers who said they felt able to relate research to their own contexts and use information from research to implement new approaches in the classroom.

On the surface it may seem hard to tease out a take-home message from these two reports. But taken in the context of our current understanding of effective CPD, they give clear direction on how to make sure teachers get maximum value from research.

It would seem that structured and bespoke support for teachers, focusing on specific actions for implementation and in-class support, is a necessary (if not sufficient) condition for making sure research has an impact. Importantly, the independent evaluations noted that the engagement of senior leaders in Research into Practice was critical to its success, while time constraints affected teachers’ ability to commit to the Research Champion model.

The importance of dedicated and specific resources to support teachers to implement research in the classroom is confirmed by today’s evaluation of our third pilot programme, Powerful Learning Conversations. 

This intervention takes insights from feedback in sports coaching and applies them to English and maths teaching in Key Stage 3. It’s based on the idea that techniques used for successful feedback in sports coaching may be transferable to feedback for pupils in classrooms.

Finding mixed results, the evaluators noted that teachers were not always clear on how to ‘operationalise’ the theory of immediate verbal feedback that was central to the programme’s design. Because the intervention was not defined tightly and specifically, it was not implemented consistently; without clear tools to support implementation, some teachers abandoned it completely.

So when we talk about implementing research, what we are really talking about is how to structure good CPD. If research is to have any use at all, we must understand not just the effect of the intervention itself, but how we can support the process of change. The publication of today’s evaluation reports means we have many more nuggets of information on how to do this successfully and strike the delicate balance between flexibility and fidelity.