Imagine (or maybe you don’t have to imagine!) that you’ve designed and/or are delivering an educational programme. Your programme has been proven to have impact, and now you’re aiming to scale up delivery to reach even more education settings, practitioners, and children and young people.
But you’re also faced with new challenges:
- How do you recruit more trainers who are able to train practitioners to deliver the programme?
- How do you quality assure the training that practitioners receive from those trainers?
- And how do you ensure practitioners are delivering the programme the way they are meant to?
These questions were central to the capability-building support that our team at Oxford MeasurEd has provided to EEF-funded projects over the past year.
Today, the EEF have published two guidance documents based on the lessons from our project: one on train-the-trainer models and one on quality assurance (QA) at scale. These are designed to provide practical guidance, rooted in evidence, to help organisations ensure that their programmes remain impactful as they scale. They even include “cheat sheets” with our top tips summarised on one page!
Because QA can mean different things to different people, we want to unpack how we have defined it in this project.
Why quality assurance?
First things first: why does QA matter? For your programme to have impact, it needs to be delivered with fidelity. And you can’t hope for the best and assume your trainers and practitioners will know what to do, however knowledgeable and talented they might be. Instead, you’ll need to establish QA systems and processes to ensure your programme is delivered as intended.
What is quality assurance?
We understand QA to include three core elements:
The first element is about laying the ground for success. For instance, you should take steps to win the hearts and minds of those involved in delivering the programme. When implementers believe in a programme, the chances of success increase. You might want to provide opportunities to see the programme working in practice, for instance through visits to “ambassador” schools and settings. Or you might want to share your own evidence that the programme works (e.g. evidence from robust EEF evaluations) to demonstrate success and impact to your stakeholders. There is no magic bullet, but it is crucial to remember that your programme’s success really does depend on how motivated those delivering it are.
Once you’ve laid the ground for success, the next step is to monitor implementation across the key stages of the programme. As the team or organisation most familiar with the programme, you will first need to define its core components and what “fidelity” and “quality” look like. Then you’ll need to decide which of these criteria to monitor, and how. As you scale, it becomes harder to collect everything you might want to, so your monitoring needs to be proportionate and purposeful. We recommend prioritising data collection on the components of the programme you are least confident will be delivered as intended.
Finally, you’ll need to be able to respond to what you find out from the data you collect. Things don’t always go according to plan, so it’s important to be prepared to act before it’s too late. There are multiple benefits to having “feedback loops”: for instance, your data can help you identify trainers, settings or practitioners who need additional support. Or the data might show that your existing training offer needs to be revised, because it is not building the skills and capabilities practitioners need to deliver the programme.
One last thought…
Before you go and have a look at the guidance notes yourself, it really is worth spending time explaining to trainers and practitioners why QA matters.
Understandably, people can be concerned about being “observed”, “watched” or even “judged”. So, we really encourage you to make a compelling case that QA is about working together to strengthen delivery and ensure your programme has an impact, rather than questioning or undermining practitioners’ expertise and autonomy. It will be worth it.