Education Endowment Foundation

EEF blog: early stage development – explaining the rationale

Author
Stuart Mathers
Head of Evidence Mobilisation

Stuart Mathers introduces our early-stage development work

Blog • 4 minutes

Over our first 11 years we’ve focused much of our evaluation activity on existing programmes, looking to understand whether a programme has the potential to support schools and teachers to improve outcomes for socio-economically disadvantaged pupils.

We’ve learnt a lot from doing this. But it’s become increasingly clear that there are areas of practice with promising evidence where there aren’t enough programmes that can help schools and other education settings to put them to use. As we’ve increased our understanding of the evidence around professional development and implementation, we’ve wondered whether it would be possible to fill these evidence-to-practice gaps by supporting the development of new programmes.

In the 2021–22 academic year we worked with Dartington Service Design Lab, drawing on their Rapid Cycle Design and Testing approach to establish a structured process to support the development of new programmes. In this first year, nine Research Schools successfully applied to design, deliver and collect feedback on a new programme within five promising areas of evidence. These included, for example, supporting the explicit teaching and practice of reading fluency at Key Stage 2, and supporting the adoption of explicit teaching of reading comprehension strategies in subjects other than English at Key Stage 3.

Each team worked on one area of evidence and used their knowledge and expertise to design what these approaches might look like in practice. For example, East London Research School considered how storybooks and the process of interactive reading could be enacted in the classroom for Reception children.

Then they considered how to support educators to adopt these practices, creating a theory of change as a tool to set out their overall programme design. For some, this included training sessions for teachers and TAs, or the provision of resources and support for lesson planning. The aim was for this to be a thoughtful and precise consideration of what was necessary to achieve the desired change.

We also worked with the UCL Centre for Behaviour Change to incorporate some of the wider evidence around effective intervention design, drawing on Behaviour Change Wheel principles.

When the overall design was in place, the teams tackled the more detailed creation of materials and resources, training content, and delivery plans. Each Research School delivered their programme to 8–10 schools and, crucially, collected formative feedback and data to allow them to understand:

  • Whether schools implemented the programme as intended,
  • Whether it was acceptable and practical to use in schools, and
  • Whether the programme design was clear and logical.

Different teams used a variety of tools to collect this feedback in order to build a rich understanding of schools’ experiences. These included surveys, weekly delivery logs, and structured focus groups.

Unusually for the EEF, this didn’t include independent evaluation. At this stage the projects are too early in their development to justify it. Instead, we focused on supporting Research Schools to collect feedback that would aid iterative development ahead of independent evaluation further along the pipeline – at pilot stage and beyond.

The feedback from Research Schools and participating schools has been positive, particularly about the opportunity to be involved in developing new programmes. We have also learnt a lot from the first cycle of this work and are positive about this as an ongoing strand of the EEF’s work.

An independent evaluation of the effectiveness of the EEF’s evidence pipeline also considered the introduction of early-stage programme development. It is due to be published in early 2023.

Given this, we will be supporting more early-stage programme development over the next three years, extending the opportunity to developers beyond the Research School Network, initially through the latest funding round. We expect the process of design and codification to be iterative. To reflect this, we will have two entry points for support:

  • Innovation – for specific areas of the evidence where we are keen to support the development of new programmes from scratch; and
  • Development – for areas identified through our research agenda where developers may have programmes that have been delivered at a very small scale and would benefit from refinement and codification.

We’ve got two main aims for this new strand of our work. First, we want to continue to learn from – and work with – others in the sector to develop and incubate new projects. Second, we want to develop a suite of resources for others to use and learn from at some point in 2023.

Over time, this new approach will support the development of more evidence-informed programmes that can be rigorously evaluated, ultimately offering more options for settings and educators looking to improve outcomes for all children and young people, especially the most disadvantaged.