
EEF Blog: Untangling the ​‘Literacy Octopus’ – three crucial lessons from the latest EEF evaluation

Jonathan Sharples
Professorial Research Fellow
Blog • 6 minutes

How can evidence improve practice in schools? Professor Jonathan Sharples, Senior Researcher at the EEF, looks at the key messages from our latest evaluation reports investigating different ways of disseminating research to make a difference in classrooms. 

Today we’ve published an evaluation of one of our most ambitious projects, the ‘Literacy Octopus’. It provides plenty of food for thought for anyone interested in improving the way research evidence informs practice, not just in education but across a range of sectors.

The ‘Literacy Octopus’ project is a pair of large, multi-armed – hence the ‘octopus’ – trials designed to evaluate different ways of engaging schools with a range of evidence-based resources and events. The common focus was on supporting literacy teaching and learning in primary schools.

The first trial tested whether simply sending schools evidence-based resources in a range of formats could have an impact on literacy outcomes – these included printed research summaries, practice guides, webinars and an online database. The second trial tested whether combining these resources with additional support to engage with them would have a greater impact.

In total, over 13,000 schools were involved in these two ‘Literacy Octopus’ trials. Some schools were just sent evidence-based resources, while others received the resources along with additional light-touch support, such as invitations to twilight seminars on using the resources in the classroom. By testing different ways of engaging schools with the same evidence, the intention was to compare ‘passive’ and ‘active’ forms of research dissemination.

In what are some of the largest randomised controlled trials (RCTs) ever conducted in education, the evaluators, the National Foundation for Educational Research, found that none of the approaches had an impact on pupil attainment, nor on teachers’ likelihood of using research to inform their practice.

The findings of the ‘dissemination’ trial, where schools were simply sent evidence-based resources, are perhaps not surprising. There have been a few studies indicating that basic research communication can have a modest impact on practitioners’ behaviours, suggesting this was worth investigating. Nevertheless, there has been growing recognition over the last 20 years that simply ‘packaging and posting’ research is unlikely, by itself, to have a significant impact on decision-making and behaviours. Today’s findings add further weight to that understanding, this time in the form of much-needed empirical research.

What this shows, I think, is that our notion of ‘research use’ needs to extend beyond just communicating evidence – for example, publishing a report online – to looking at how it is effectively transformed and applied to practice. This message is particularly sobering, given that basic communication strategies still make up the majority of organisations’ efforts to mobilise research evidence, despite those organisations being aware of the limitations. This applies to all sectors, not just education.

So what about the second ‘Literacy Octopus’ trial, the ‘engagement’ one, which tested the impact of providing schools with some additional support to engage with the evidence-based resources, yet also failed to show an overall impact on teaching and learning?

A recent systematic review, published last year by my colleagues at the EPPI Centre at UCL’s Institute of Education, sheds some useful light on what might be going on. This review looked at six mechanisms that underpin a range of ‘knowledge mobilisation’ interventions and how they impact on decision-making (e.g. creating access to research, developing skills to use research). Importantly, in addition to reviewing different mechanisms to mobilise evidence, they also looked at the behavioural requirements that were necessary for those various approaches to have an impact. This included having:

  1. opportunities to engage with the interventions
  2. the motivations to do so, and
  3. the skills and capabilities to understand and use the outputs.

Crucially, across all the different types of research use interventions, they found that impacting on decision-making relied on also attending to these behavioural needs. For example, interventions that create access to research only appear to impact on decision-making if they are also combined with strategies that create opportunities and motivation for engaging with that evidence. Interventions that focus on building people’s skills to use evidence, for example through training and professional development, are conditional on people also having the capabilities to act on it. Furthermore, it is often the use of multiple strategies – as opposed to single strategies – that influences decision-making, particularly where these approaches are embedded in existing structures and processes (e.g. school improvement or policy systems).

In light of these insights, the interventions in what we termed the ‘active’ arms of the Literacy Octopus actually appear to be light touch. To what extent, for example, can attending a conference create the opportunities, motivations and skills to be able to do something with the evidence that was being presented? What further support, capacity and conditions are needed for that evidence to gain traction on classroom and school improvement?

A range of evaluations funded by the Education Endowment Foundation over the last few years illustrate a similar trend: that just exposing teachers to information about evidence-based practices is rarely sufficient in itself to improve teaching and learning, even if that information is underpinned by robust research. Projects such as Anglican Schools Partnership Effective Feedback, Research Champions, Teacher Effectiveness Enhancement Programme, and Challenge the Gap have all looked at high-impact strategies in our Teaching and Learning Toolkit, yet failed to see an impact on pupils’ learning outcomes.

If we look at projects that do show promise, they often provide careful scaffolds and supports to help apply conceptual understanding to practical classroom behaviours and specific subject domains. Indeed, schools in the ‘Literacy Octopus’ trials that did change their practice using the evidence presented often appeared to do so through structured in-school collaboration and enquiry.

Were we right to fund this project? I think we were, for three reasons.

Firstly, the activities in the ‘Literacy Octopus’ trials are typical of the kind of research-use and knowledge-mobilisation activity currently going on in the UK, so are worth evaluating. That is not to say these activities aren’t useful – you can’t use research, after all, if you don’t know about it – although the evidence suggests they should be seen as a necessary, but not sufficient, condition for practical research use.

Secondly, as hinted at earlier, these evaluations add valuable evidence to our existing understanding of knowledge mobilisation. There is a striking paucity of robust, quantitative evaluations of interventions to support research use in schools, and these results extend and deepen the evidence base in this area.

Thirdly, and perhaps most importantly, these findings provide useful insights that can shape future work.

We take away three key lessons:

  1. Traditional communication and dissemination of research should be seen as just one strand of a multi-faceted approach to mobilising knowledge. Although traditional communication of research can provide a cost-efficient way of engaging a large number of schools, and create widespread awareness of a piece of work, it should be seen as a foundation for further activities, rather than a means to research use in itself.
  2. Projects and interventions that encourage an engagement with research need to provide better support for translation and adoption back in the school. For example, a growing body of evidence demonstrates the benefits of in-school coaching and mentoring in supporting changes in classroom behaviours – we should explore how these and other activities can be woven into projects that support research use in schools.
  3. We should continue to help build the general capacity and skills in the sector to use research as part of school improvement. This includes developing resources and processes to support evidence-informed school improvement, as well as creating wider readiness and incentives to use research by working with regional and national policy makers.

The EEF is being responsive to these insights. We have helped create the Research Schools Network, a group of schools across the country building regional capacity to find and use evidence. We have also developed scale-up campaigns – initially on ‘Making Best Use of Teaching Assistants’ and primary literacy – that adopt a range of knowledge mobilisation strategies, such as policy engagement, sector-led training, actionable guidance reports, implementation resources and programme scale-up, in addition to traditional communication and dissemination. In the new year, we will publish our next EEF guidance report, ‘A School’s Guide to Implementation’, which will unpack the latest research on effectively putting a decision or idea into effect.

In 2012, the EEF commissioned an excellent discussion paper which suggested that effective knowledge mobilisation is a product of interactions between research producers, mediators and users, and that strong knowledge mobilisation initiatives build capacity across all three elements concurrently. The findings from the ‘Literacy Octopus’ suggest that there are few short cuts to achieving this objective.