
EEF Blog: Teacher Choices Trials – our new approach to researching questions teachers want answers to

Author: Professor Rob Coe, Senior Associate

What is the most effective way to start a lesson? Does asking your class to line up outside the door improve behaviour? What is the most effective way to read with your class? Prof. Rob Coe, senior associate at the EEF, introduces our latest initiative to generate evidence teachers can put to practical use to improve their classroom practice…

What has the EEF ever done for us?

Since the Education Endowment Foundation began in 2011, there has been a transformation of the landscape of evidence generation, synthesis and mobilisation in England. 

To date, the EEF has commissioned robust evaluations of over 190 programmes; its evidence synthesis, the Teaching and Learning Toolkit, is widely seen by teachers and school leaders as a trusted source of accessible evidence [1]; its guidance reports are accessed by teachers many thousands of times a month; and the Research Schools Network is becoming established as a driving force in the mobilisation of evidence-informed practice.


That there has been a dramatic change in the status of research evidence in debates about teaching seems hard to deny. Before the EEF began, only a handful of researchers in England had actually done randomised controlled trials (RCTs) in schools and many education researchers argued that RCTs were inappropriate or immoral. 

My own perception of this change is of course anecdotal and selective, but it is quite dramatic.

Before 2011, if you did or advocated RCTs in education you were part of a marginalised, tiny minority within the research community, struggling to get funding or even to get work published in mainstream journals. When I spoke to groups of teachers it was always a hard sell to make the case that education research had anything to offer a classroom practitioner. If you tried to persuade them that an intuitively obvious belief which conflicts with solid evidence (e.g. about learning styles or class size) might be wrong, you pretty much always lost the argument.

Today there have been hundreds of trials in schools, supported by the EEF and by many other funders: over half the schools in England have participated in an EEF trial. Hundreds of researchers have designed, analysed and reported those RCTs, so the depth and breadth of expertise is incomparably greater.

Almost no-one now says it is unethical to do RCTs or that you can’t do them in schools. There is some debate still about how many (or what balance) we should have and what they can or can’t tell us – which is entirely appropriate, and something we must actively encourage.

I know that the teachers who listen to me are more likely to know and welcome the kinds of things I will say, but today I seldom encounter an audience of teachers in England where people are not aware of and interested in research evidence, and where some have thought very deeply about how it influences their practice.

Of course, the EEF is far from the only organisation responsible for this change; anyone interested in research evidence knows that correlation is not causation and that attribution is tricky.

The growth of Twitter and blogging, for example, has opened up channels for debate and sharing, and has certainly contributed.

In particular, researchEd provides vibrant forums for teachers to engage with research at numerous events around the world and has undoubtedly had a substantial impact on the awareness and status of research evidence in both policy and practice. 

Many other organisations, representing funders, researchers and teachers, have also contributed, including (but not limited to) Wellcome, EDT, CCT, CUREE, NFER, TDT, CEM and EBE. (Who says education has too many acronyms?)

Crucially, these organisations are not competing with each other for the credit for promoting the change, but mostly collaborating, aligning their efforts and welcoming each other’s contributions. For anyone who remembers the low status of research evidence in education practice and policy ten years ago, these are exciting times.

Making research relevant to teachers

Despite the positive story of change in the availability and acceptance of evidence use in schools, there remain plenty of challenges.


One specific challenge that I want to discuss here is the perception that much of this research is not relevant to practitioners. Of course, there are plenty of other challenges and we intend to discuss some of these in a later blog, but relevance is a good place to start!

Teachers and headteachers are keen for the EEF to answer research questions which can directly feed into existing teaching practice.

These questions are often not about the impact of the kinds of manualised programmes that the EEF has typically evaluated, which require schools to purchase particular resources or training.

Instead, they relate to the everyday choices that teachers have to make when planning their lessons and supporting their students. Is it possible that these everyday decisions could be informed by evidence from direct comparisons of the options?

In making decisions about their practice, teachers are still much more likely to be influenced by their own ideas or ideas from other schools than by any kind of research evidence. Part of the reason for this may be that research does not appear to answer the questions they want to ask. 

There is room for genuine debate about whether research can ever in fact answer these questions. Can research give simple and clear guidance to teachers about what they should do? Or will it always be more complicated than that?

Teachers, like all human beings, are much more likely to accept as true the results of studies that confirm what they already believed. Evidence alone seldom changes people’s minds.

There is also a tendency to believe that research claims are obvious once we hear the result, even though we might have been unable – or unwilling – to predict them beforehand. These cognitive biases may be another part of the explanation of why research is sometimes perceived as not relevant.

A new EEF approach: Teacher Choices Trials

Teacher Choices Trials is a strand of work begun by the EEF in early 2019. This is the reason I joined the EEF in February and the main project I have been working on.

The aim is to identify and evaluate the impact of direct choices teachers make in their own classrooms. If there are choices that make a difference, then we want to find them and evaluate their impact robustly.

We do not yet know what kinds of choices can be presented and evaluated in such a simple way: that is part of the research agenda to find out. But we are clear that many teachers want answers to these kinds of questions and that many believe some options will be genuinely and clearly better than others.

Our aim is to curate the questions teachers want to ask and generate and share the evidence to answer them.


It is important to note that this work builds on some existing good work in this area. For example, the Institute for Effective Education in York has funded and coordinated a number of teacher-led trials. Kyra Research School has also produced its own teacher-led research programme. Richard Churches at the Education Development Trust has been prolific in training and supporting teachers to conduct trials of choices about their practices [2].

Of course, teacher-led research presents a number of challenges to the aim of generating robust and generalisable claims about impact. 

One of the distinctive features of the evaluations commissioned by the EEF is their rigour: this has been an uncompromising aim for the organisation. For the most part, teachers are not experts on research methodology, experimental design or analysis. Without extensive training they are unlikely to be able to design, carry out and analyse rigorous evaluations.

While we fully recognise the value of evaluation evidence that may not generalise beyond the context in which it was generated (“Did it work in my school?”), and of the learning about research processes and design decisions that participants gain along the way, we also want the EEF’s Teacher Choices trials to generate robust, generalisable evidence.

In our work we want teachers to lead on framing the research questions and identifying relevant contexts, as well as carrying out the trial, collecting data and so on, but we will ask specialists to design the evaluation and analyse the data. For the first phase of the work we have commissioned a team from the National Foundation for Educational Research (NFER) to do this.

Among the questions teachers might ask are: 

  • What is the most effective way to start a lesson? 
  • What is the most effective way to read with your class? 
  • Does asking your class to line up outside the door improve behaviour?

We are keen to hear other suggestions!

Our first trial: a comparison of different ways of starting a lesson

After consultation with teachers, we have chosen to make the first Teacher Choices trial a comparison of different ways of starting a lesson. Specifically, we will compare starting a lesson with a retrieval quiz against starting it with a discussion designed to engage interest.

Retrieval practice is a good first choice because it is well supported by a wealth of laboratory evidence and some school-based studies. It is also popular and widely advocated among research-aware teachers. Indeed, some would say that the evidence is already clear: if we want students to remember stuff, then getting them to practise retrieving it is one of the best ways to achieve this.


But there are no studies (as far as we know) of the impact of school teachers’ own interpretations and implementations of retrieval practice (for example, where teachers create the retrieval quizzes and choose how they will be used). 

If we want to know whether it is a good thing to give teachers general advice like “Incorporate retrieval quizzes into your lessons”, then we need to know whether the ways that teachers interpret, adapt and implement that advice lead to effective instances of the intended practice.

We have chosen discussion starters as the comparator because they are a popular way to start a lesson: asking questions that engage students’ interest and get them thinking about what will be covered in the rest of the lesson. We want to specify the comparison, rather than compare with ‘business as usual’, so that any difference in outcome is clearly interpretable.

We have chosen to focus the trial on Year 8 science topics as a way of narrowing the range of variation within the trial, and Year 8 science teachers will add the starters to their normal lessons from mid-February until the end of May 2021.

We started this trial last year. However, as a result of COVID-19, we were only able to complete half of it. We are, therefore, recruiting to re-run the trial, and to make this happen, we need you!

We know that many Key Stage 3 science teachers will be keen to be involved in this Teacher Choices trial.

To express your interest, register by completing this form. You will find further information here. If you have any questions, email teacherchoices@nfer.ac.uk.

If you work in a group of schools, or an academy chain, you can also sign up together.

Recruitment has begun and will close on 3rd December 2020. Please sign up and join us in creating rigorous and relevant evidence for practice.

References

[1] Coldwell, M., Greany, T., Higgins, S., Brown, C., Maxwell, B., Stiell, B., Stoll, L., Willis, B. and Burns, H. (2017). Evidence-informed teaching: an evaluation of progress in England. Research Report. Department for Education.

[2] Churches, R., Hall, R. and Higgins, S. (2017). The potential of teacher-led randomised controlled trials in education research. In A. Childs and I. Menter (eds), Mobilising Teacher Researchers (pp. 113–140). Routledge.