
EEF Blog: Making Best Use of Teaching Assistants – what have we learned so far about supporting schools to put evidence into action?

Author
Jonathan Sharples
Professorial Research Fellow
Blog • 4 minutes

In this blog, senior researcher Prof. Jonathan Sharples explores what we’ve learned from the EEF’s campaign to support schools in putting into action the evidence on how best to deploy their teaching assistants. He looks at what we did and what we learned, and asks: what’s next?

In many ways the EEF is an unusual organisation. While a lot of our time is spent building the evidence base – generating new knowledge through trials and synthesising existing research – we also have a practical (and moral) mission: to support teachers and senior leaders in raising attainment and closing the disadvantage gap.

That purpose cannot be achieved merely by hoping evidence will find its way into the right hands in the right place at the right time. Simply put, producing and communicating evidence is valuable, but insufficient, for a charity with a mission to break the link between family income and educational achievement. A proactive approach is required to support schools in realising the potential of what we know about ‘what works’ to improve teaching and learning.

In 2012, Prof. Carol Campbell co-wrote a paper for the EEF that presented evidence-informed practice as a combination of factors relating to production, mediation and use, and argued that a strong evidence mobilisation plan would require attention to all three of these elements. She also suggested that in an increasingly decentralised education system, such as we have in England, mobilising knowledge to 20,000 individual schools is not an easy task.


At the risk of spoiling the ending, she was right on both counts.

What we did

We have just published a series of independent evaluations of the EEF’s own initial efforts to scale up evidence-informed practices, including four reports on the Making Best Use of Teaching Assistants campaign.

This campaign was our first attempt to distil the best available evidence on the topic into a guidance report with practical recommendations for schools, and then to mobilise it using a range of different strategies.

It was also our first attempt at working regionally at scale – focusing our efforts on hundreds of schools in specific geographical locations – and at recruiting a range of partner organisations to work with us to bring our evidence to life.

As part of the Making Best Use of Teaching Assistants campaign we piloted two different approaches to mobilise the guidance, both involving working with regional partners:

  • The first regional pilot, conducted in South and West Yorkshire, involved commissioning a range of practice-based organisations – e.g. multi-academy trusts (MATs), teaching school alliances (TSAs) and local authorities (LAs) – to support schools in understanding and implementing the evidence.
  • The second regional pilot, conducted across Lincolnshire, involved embedding the scale-up work in existing school improvement initiatives across the county (e.g. existing regional networks).

Both approaches involved a range of practical engagement and implementation activities, including conferences, training workshops, action planning activities, evidence-based interventions and school-to-school support.


What we learned

At the outset we commissioned independent evaluations of this work, to help us understand what has and hasn’t been effective. We’ve learned a lot along the way, and in a series of blogs over the next few weeks I’ll be sharing some of these insights.

A striking finding across these projects has been the appetite we have witnessed for evidence-informed practice. Levels of engagement were high – over 750 schools participated across the two projects – and levels of drop-out were generally low. My own view is that, in our increasingly autonomous and fragmented school system, ‘evidence-informed practice’ provides an apolitical improvement agenda around which schools can coalesce. All schools want to improve the quality of teaching and learning: focusing on evidence provides a common focus that sits outside of school structures and niches.

A second overarching finding is that developing evidence-informed practices at scale is possible, although challenging. Encouragingly, there are signs that the work in Yorkshire may have had a positive impact on pupil attainment across the county at Key Stage 2 in English (see here). This finding should be treated with caution due to the small size of impact measured and the moderate-to-low security of the result on the EEF rating scale. Nevertheless, it is promising to see signs of any impact on pupil outcomes when operating at this scale. At the same time, the small impacts represent the challenge and complexity we face in developing evidence-informed practice at scale.

What next?

Moving forward seven years, we have updated Prof. Campbell’s model in light of the insights that have emerged from the independent evaluations of our scale-up work. (Thanks in particular to Bronwen Maxwell and her team at Sheffield Hallam University for moving this on).

[Figure: EEF scale-up diagram, June 2019]

It posits that evidence-informed practice relies on the alignment of four key factors, as described below and shown in the diagram:

  1. the quality and usefulness of the evidence
  2. the receptiveness and capacity of schools as evidence users
  3. the presence of skilled research intermediaries
  4. the alignment with the wider school system.

Research use can fail at any of these points, meaning you are only as strong as the weakest link (see it more as a multiplication than an addition).
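As a minimal illustration of that multiplication analogy (the factor values here are purely hypothetical, chosen only to make the arithmetic visible): if strong evidence, skilled intermediaries and good system alignment are all present but schools have no capacity to use the evidence, the overall effect collapses to nothing – whereas simple addition would suggest you were still most of the way there.

\[
\underbrace{1}_{\text{evidence}} \times \underbrace{0}_{\text{school capacity}} \times \underbrace{1}_{\text{intermediaries}} \times \underbrace{1}_{\text{system alignment}} = 0
\qquad \text{versus} \qquad
1 + 0 + 1 + 1 = 3
\]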

In the evaluations of the Making Best Use of Teaching Assistants campaign, it was found that the receptiveness and capacity of the schools to use the guidance and regional training was often a limiting factor in enabling changes in practices. This resonates with our wider observations that schools’ ability to translate and implement evidence is variable, and at times weak, across the sector.

As Prof. Campbell suggested in 2012, strong evidence mobilisation involves addressing all four factors collectively, as well as the interactions between them. Over the summer, we are looking at these four elements in more detail, and what we can collectively do to create an education system that is world-leading in its use of research.