
The Literacy Octopus: Communicating and Engaging with Research – trial

Multiple delivery teams
Impact: 0 months' progress. The impact measure shows the number of additional months of progress made, on average, by children and young people who received the intervention, compared to similar children and young people who did not.
Project info

Independent Evaluator: NFER
A large multi-arm randomised controlled trial investigating a range of different methods of communicating research to schools and engaging them in research evidence.
Schools: 13,000 | Grant: £630,000
Key Stage: 2 | Duration: 3 years 5 months | Type of Trial: Effectiveness level evidence
Completed December 2017

The two Literacy Octopus trials – named after their multi-armed design – drew on a wide range of evidence-based resources and events designed to support the teaching and learning of literacy at Key Stage 2. They were delivered by four partners with extensive expertise in education research, and included printed and online research summaries, evidence-based practice guides, webinars, face-to-face CPD events, and access to online tools. The trials were funded jointly with the Department for Education and the Mayor’s London Schools Excellence Fund.

Translating evidence-based resources into real change in the classroom is difficult. Simply sending resources to schools is often thought to have no effect, even when those resources are high quality and evidence-based. However, there have been specific examples where putting hard copies of practical guides in the hands of professionals appears to have made a difference to practice, and at minimal cost. So the first trial tested whether sending schools high quality evidence-based resources in a range of different formats could have an impact on pupil outcomes.

The second trial tested whether combining the provision of resources with light-touch support on how to use them would have greater impact. Some schools were simply sent evidence-based resources, while others received the resources along with simple additional support, such as invitations to seminars on applying the resources in the classroom. As well as pupil outcomes, this trial also measured teachers’ use of research, to help us understand the impact on teacher behaviour.

Neither trial found evidence of improved literacy attainment at Key Stage 2 and the second trial found no increase in teachers’ use of research. These high quality trials tested a wide range of interventions in a large number of schools. The results suggest that, in general, light-touch interventions and resources alone are unlikely to make a difference.

This has important implications for organisations which are using evidence to improve teaching and learning. At EEF we have already started to take a more intensive approach to scale-up, with more structured support for schools and a greater focus on implementation. Our campaigns and Research Schools network are good examples of this. Importantly, we will continue to evaluate the impact of this activity.

The interventions

The first trial tested the following interventions. The Research Results table below shows the results for each.

  • 1A ‘Improving Reading’ guide – Evidence-based printed booklet, Improving Reading: A Guide for Teachers. From CEM, part of Durham University.
  • 2A Evidence updates & website – Regular printed and electronic materials, including the Better Evidence-based Education magazine and the ‘Best Evidence in Brief’ email, and access to a searchable database, Evidence 4 Impact. From IEE at the University of York.
  • 3A Webinar – A link to the archived webinar and materials from a conference on research evidence relating to KS2 literacy. From ResearchEd, in collaboration with NatCen.
  • 4A Teaching How2s website – Access to the Teaching How2s website, including evidence-based visual guides on CPD. From Campaign for Learning/Train Visual.

The second trial tested the following interventions. The Research Results table below shows the results for each.

  • 1B Evidence updates & website – Regular printed and electronic materials, including the Better Evidence-based Education magazine and the ‘Best Evidence in Brief’ email, and access to a searchable database, Evidence 4 Impact. From IEE at the University of York.
  • 2B Evidence updates & website + event – Intervention 1B plus invitation to an evidence-based literacy programmes event.
  • 3B Teaching How2s website – Access to the Teaching How2s website, including evidence-based visual guides on CPD. From Campaign for Learning/Train Visual.
  • 4B Teaching How2s website + intro event – Intervention 3B plus invitation to a one-day introduction to using the Teaching How2s website, and updates on using the visual guides.
  • 5B ‘Improving Reading’ guide & emails – Hard-copy evidence-based KS2 literacy teaching materials, including the evidence-based printed booklet Improving Reading: A Guide for Teachers and monthly classroom activity posters. From CEM, part of Durham University.
  • 6B ‘Improving Reading’ & emails + CPD session – Intervention 5B plus invitation to one twilight CPD session.
  • 7B ‘Improving Reading’ & emails + CPD sessions & tools – Intervention 6B plus invitation to one further twilight CPD session, use of pupil diagnostic tools, and teacher peer observation between sessions.
  • 8B Conference – Invitation to free Saturday conference on research evidence relating to Key Stage 2 literacy. From ResearchEd, in collaboration with NatCen.
  • 9B Conference + webinars – Intervention 8B plus invitation to two webinars to provide support on applying the research from the conference in schools.

Key conclusions

  1. The first trial found no evidence that the Literacy Octopus passive dissemination interventions improved pupils’ Key Stage 2 English scores compared with the control group. The five padlock security rating means we have very high confidence in this result.
  2. These findings suggest that simply disseminating research summaries and evidence-based resources to schools is not an effective way for research organisations to support schools to improve pupil outcomes.
  3. It is likely that these materials formed a small part of the information received by schools during this time. It is possible that if schools had support to navigate and prioritise that information, greater impact could be achieved. Alternatively, schools may need more support in transforming such materials into actual change in the classroom.
  4. The evaluation team have analysed the Key Stage 2 results from a further year following the project to explore if there are longer-term effects of the interventions. These results are published in addenda to the main reports.
  5. The second trial found no evidence that any of the interventions improved pupils’ Key Stage 2 English scores. The five padlock security rating means we have high confidence in this result.
  6. There was no evidence of impact on any of the six teacher Research Use Measures used in this trial. However, we have limited confidence in this result because the response rate to the questionnaires designed to capture these outcomes was low and some of the measures were only moderately reliable.
  7. Schools’ level of engagement varied: six out of ten schools did not engage to the level expected by the providers, although a small proportion engaged to a greater extent than expected (for example by hosting CPD sessions). Reasons for not engaging included lack of time, the timing and location of events, and a preference for face-to-face support rather than online or remote formats only.
  8. Teachers felt research evidence was most effectively communicated when it was interactive, accessible, relevant, included a balanced and credible discussion of the evidence, and focused on how to apply the evidence in practice. Where schools went on to implement changes in light of the interventions, these came about through mechanisms such as in-school collaboration, further enquiry, and trying out, reviewing, adapting, and embedding approaches.
  9. The lack of impact across the different interventions suggests that simply communicating research evidence to schools is not enough to improve outcomes. How easily the presented evidence can be used in practice — and the conditions in schools for implementing evidence-based change — might be just as important. Further research should assess whether interventions can transform evidence into practical action, and develop supportive implementation conditions in schools.

Research Results

Impact is shown in months' progress (the size of the difference between pupils in this trial and other pupils). Security shows how confident we are in each result.

Outcome/Group | Impact (months' progress) | Security
1A ‘Improving Reading’ guide | 0 |
2A Evidence updates & website | 0 |
3A Webinar | 0 |
4A Teaching How2s website | 0 |
1A ‘Improving Reading’ guide, FSM | 0 | N/A
2A Evidence updates & website, FSM | 0 | N/A
3A Webinar, FSM | 0 | N/A
4A Teaching How2s website, FSM | 0 | N/A
1B Evidence updates & website | 0 |
2B Evidence updates & website + event | 0 |
3B Teaching How2s website | 0 |
4B Teaching How2s website + intro event | 0 |
5B ‘Improving Reading’ guide & emails | 0 |
6B ‘Improving Reading’ & emails + CPD session | 0 |
7B ‘Improving Reading’ & emails + CPD sessions & tools | 0 |
8B Conference | 0 |
9B Conference + webinars | 0 |
1B Evidence updates & website, FSM | 0 | N/A
2B Evidence updates & website + event, FSM | -1 | N/A
3B Teaching How2s website, FSM | -1 | N/A
4B Teaching How2s website + intro event, FSM | 0 | N/A
5B ‘Improving Reading’ guide & emails, FSM | 0 | N/A
6B ‘Improving Reading’ & emails + CPD session, FSM | 0 | N/A
7B ‘Improving Reading’ & emails + CPD sessions & tools, FSM | +1 | N/A
8B Conference, FSM | 0 | N/A
9B Conference + webinars, FSM | 0 | N/A