Professor Becky Francis, the EEF’s chief executive, writes:
Our evidence generation is driven by a moral purpose. In my first letter, as I took up leadership of the EEF, I described the EEF's mission as a story of two gaps: closing the evidence gap to help close the disadvantage gap. But there is a third gap I want us to help close: the gap between relevance and rigour. Practitioners and policy-makers – understandably – want answers right now to help tackle the challenges they face. Waiting for high-quality evidence can sometimes seem like a researcher's luxury. Our aim at the EEF must be to ensure we are addressing the questions of greatest importance to teachers and senior leaders in the most robust and timely way possible. Two of our newest initiatives highlight how we are doing this.
Our Teacher Choices trials are exploring some of the most common questions teachers ask about their practice, testing the everyday choices made when planning their lessons and supporting their students. For instance, what's the most effective way for secondary science teachers to tap into pupils' prior knowledge and activate working memory at the start of a lesson – with a discussion or a quiz? Or what's the best way for primary school teachers to read aloud with their class to promote comprehension and enjoyment? We're trialling both right now. Two things particularly excite me about this programme of work. First, we're actively involving teachers in identifying the research questions that matter most to them. Second, the trials over-recruited before their deadlines, demonstrating the enthusiasm of the teaching profession for generating evidence they can put to good use.
Our School Choices programme, meanwhile, focuses on supporting senior leaders to make evidence-informed decisions on some of the larger-scale practical issues they face (and which cannot straightforwardly be tested through RCTs of interventions). For instance, we're comparing outcomes for primary pupils in disadvantaged schools taught by NQT-level teachers trained in three different routes. And this week we've announced we'll be exploring what works most effectively at Key Stage 4 – two or three years of study? We'll be looking at both GCSE outcomes and the breadth of curriculum offered by schools. Everyone, it seems, has an opinion. Our role is to provide evidence to support the debate.

These initiatives are, of course, in addition to our regular grant-making to evaluate the impact of high-potential projects aiming to close the disadvantage gap. Our latest batch, just announced, includes three focusing on improving maths, from the early years through primary to GCSE. We have also this week published a further three independent evaluations of EEF-funded projects, bringing the total of completed projects to 127. As I noted in an article this month for TES:

… Evidence must help to democratise education. The studies the EEF publishes are owned by the profession. Our Toolkit represents a knowledge base built by literally millions of teachers and students around the world. Evidence does not provide easy solutions, but evidence-informed improvement is a process that has integrity and holds greater promise than any alternative.
Children and young people with social care experience
There is another way in which the EEF is supporting both relevance and rigour. This week our colleagues at the What Works Centre for Children’s Social Care published What Works in Education for Children who have had Social Workers. This is based on their re-analysis of 63 EEF trials, exploring which interventions appear to have larger positive impacts for children and young people who have had a social worker than for their peers. The EEF’s work has, so far, involved over 14,000 schools, nurseries and colleges and reached over 1.6 million children and young people. As my colleague Stephen Fraser notes here – What can 63 EEF trials tell us about closing the attainment gap for young people with social care experience? – we always hoped the archive of our trials’ data would grow to become a public good in its own right, used by other organisations with a shared interest in improving young people’s outcomes to support their own aims. This is the first time one What Works Centre has systematically re-analysed the data from another. It’s a significant achievement for smart partnership working – and for relevant and rigorous public policy.