Embedding Formative Assessment (re-grant)

This project and its evaluation were affected by the 2020 and 2021 partial school closures caused by the Covid-19 pandemic. As a result, the delivery has been extended and outcome data collection will be later than planned. The evaluation protocol is being updated and will be published here as soon as possible.

Embedding Formative Assessment (EFA) is a professional development programme delivered by SSAT which aims to improve pupil outcomes by embedding the use of formative assessment strategies across a school. Schools receive detailed resource packs to run monthly workshops, known as Teacher Learning Communities (TLCs), and teachers conduct structured peer observations focusing on the use of formative assessment strategies.

Each monthly TLC lasts 75–90 minutes. All teaching staff are involved and split into groups comprising 8–14 people. TLC agendas and materials focus on five key formative assessment strategies: ‘clarifying, sharing and understanding learning intentions’; ‘engineering effective classroom discussions and activities’; ‘providing feedback that moves learning forward’; ‘activating learners as instructional resources for one another’; and ‘activating learners as owners of their own learning’. Within each of these high-level concepts, the TLC handouts introduce multiple formative assessment techniques for teachers to consider.

Each school appoints a lead teacher who co-ordinates an in-school training day for senior leaders and the school's Teacher Learning Community leaders. Following this initial day, the school lead receives ongoing implementation support from an EFA mentor. This includes a mixture of visits, phone calls and emails.

Why are we funding it?

The programme was developed based on existing evidence that formative assessment can improve students’ learning. Many schools already prioritise formative assessment, but often report that it can be challenging to implement. EFA provides schools with a structured approach to developing their formative assessment practices.

A previous EEF effectiveness trial in 140 schools had promising results, with students in the EFA schools making the equivalent of two months’ additional progress on their Attainment 8 GCSE scores. The result of this trial had a very high security rating.

As a result of the promising effectiveness trial, the EEF would now like to provide funding to SSAT to enable them to reach a greater number of schools. This project would provide SSAT with core funding to ensure that they have the necessary capacity and processes in place to make EFA available to more schools in the future.

How are we evaluating it?

A team from the Behavioural Insights Team, led by Alex Sutherland, will be evaluating this project. The evaluation will aim to understand the scaling process and quality of implementation as EFA moves to a larger scale.

The evaluation will be composed of two main elements:

  1. Assessing the scale-up of the EFA programme, including the facilitators and barriers to scaling, fidelity to the programme, and how fidelity changes as the programme scales.
  2. Generating formative insights on scale-up for SSAT, and for the EEF to draw on when supporting the scale-up of other education interventions.

The evaluation will employ mixed methods, including surveys and case studies.

In addition to the scale-up evaluation, the EEF has commissioned BIT to undertake a mailer RCT to test whether certain marketing messages can encourage school leaders to adopt an evidence-based programme. This two-arm trial explores whether testimonial or evidence-based marketing material results in differential expressions of interest (EOIs) and sales of EFA. The study sample comprises 2,000 schools that do not meet the exclusion criteria; these schools will be randomly assigned to one of two treatment groups.

Each group will receive a letter and a supplementary EFA programme description constructed using either testimonial support (from a previous case study) or evidence-based support (from the previous EEF-funded evaluation). The results of this study will enable us to make recommendations about how to improve the design of school recruitment materials, as well as contribute to wider knowledge about the adoption of evidence-based practices at scale.
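As a rough illustration of the allocation step, the sketch below randomly splits an eligible sample of 2,000 schools into two equally sized arms. The school identifiers, arm labels, and fixed seed are illustrative assumptions, not details from the trial protocol; the actual randomisation may be stratified or otherwise more sophisticated.

```python
import random

def assign_two_arm(schools, seed=2021):
    """Randomly split eligible schools into two equally sized trial arms.

    Illustrative sketch only: the arm labels below are assumptions made
    for this example, not the labels used in the trial protocol.
    """
    rng = random.Random(seed)      # fixed seed so the allocation is reproducible
    eligible = list(schools)
    rng.shuffle(eligible)          # shuffle a copy of the eligible sample in place
    midpoint = len(eligible) // 2
    return {
        "testimonial_letter": eligible[:midpoint],
        "evidence_based_letter": eligible[midpoint:],
    }

# Example: 2,000 eligible schools split into arms of 1,000 each.
allocation = assign_two_arm(f"school_{i:04d}" for i in range(2000))
print({arm: len(group) for arm, group in allocation.items()})
```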

When will the evaluation report be due?

The scale-up evaluation report will be published in Spring 2024. The mailer trial report will be published in Winter 2021.