Building evidence infrastructure is a global good

Jonathan Kay introduces a collaboration with eBASE and Durham, funded by the Centre of Excellence for Development Impact and Learning, that hopes to contribute to a "global evidence infrastructure"

Two facts are widely accepted across educational research:

  • Systematic reviews are one of the most rigorous methods for understanding the evidence behind an approach
  • Systematic reviews are expensive and take a really long time

There are reasons systematic reviews commonly sit atop hierarchies of evidence. They aim to search and summarise the literature exhaustively, through a process that is pre-specified and free from bias. Researchers don’t get to just summarise the first few hits on Google Scholar or the well-known, highly cited articles in the field; because the process is fixed in advance, there is no risk that a researcher biases the results through personal preference. When we develop a new topic in the Teaching and Learning Toolkit, we now:

  1. Create a protocol that documents the process to follow
  2. Conduct searches of academic databases pre-specified in the protocol
  3. Screen all of the studies to check whether they meet the pre-specified inclusion criteria (for example, does the study measure pupil outcomes?)
  4. Extract information from the included studies – not just effect sizes, but other aspects like the context in which a study takes place
  5. Synthesise the data
  6. Write up a plain language summary of the findings.
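The screening step above can be sketched in code. This is a minimal, illustrative example only, assuming made-up criteria and study fields (the EEF's actual inclusion criteria and data model will differ):

```python
from dataclasses import dataclass

@dataclass
class Study:
    """A candidate study found by the database search (illustrative fields)."""
    title: str
    measures_pupil_outcomes: bool
    has_comparison_group: bool

def meets_inclusion_criteria(study: Study) -> bool:
    """Step 3: screen a study against pre-specified inclusion criteria."""
    return study.measures_pupil_outcomes and study.has_comparison_group

candidates = [
    Study("Phonics intervention RCT", True, True),
    Study("Teacher attitudes survey", False, False),
]

# Only studies meeting every pre-specified criterion go forward to extraction.
included = [s for s in candidates if meets_inclusion_criteria(s)]
```

The point of pre-specifying the criteria is exactly what the code makes explicit: the filter is written down before anyone sees the search results.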

As you can imagine, this process takes time – time that simply does not exist when you need to respond quickly to a crisis or policy question. It is this lack of time that has encouraged researchers to turn to new solutions for producing rigorous summaries of evidence quickly.

In response to Covid-19, the Cochrane Collaboration, one of the trailblazers of the systematic review methodology, published its guidance for conducting rapid evidence reviews. At the EEF, we have also been conducting rapid evidence assessments. For example, our publication on remote learning aimed to be quick but systematic by conducting a pre-specified search of existing systematic reviews on remote learning. (For context on how even a rapid evidence assessment is not free from burden, our search for this rapid review yielded almost 4,000 studies that needed to be reviewed before we could summarise the 60 eligible for inclusion.) Other ways of making the process quicker include using machine learning to automate parts of the process and using crowdsourcing for parts of reviews (another innovation led by Cochrane).

Another solution to this tension between rigour and speed is to build “evidence infrastructure”. By this we mean creating systems that allow us to quickly access and synthesise evidence. Rather than conducting new searches, screening and data extraction, what if we already had information about education evaluations, ready for analysis and communication? Education evidence building is still at a stage where a finite number of evaluations end up summarised across numerous different reviews.

This is the vision behind the EEF education database. Our ambition is that one day information about context, methodology and impact will be readily available from every education evaluation in the world. When a reviewer needs to respond quickly to a policy question, they can begin straight away at the fifth step in the process – synthesising the findings and presenting answers to teachers.

We are already beginning to build this infrastructure internally at the EEF. At the last count, we had 2,500 studies in the database. Whenever we commission a new systematic review, we now commit to consistent data extraction that allows data from new studies to be added to the database.
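Consistent data extraction amounts to agreeing a shared record format up front. A minimal sketch of what such a record might look like – the field names here are purely illustrative assumptions, not the EEF database's actual schema:

```python
from dataclasses import dataclass, asdict

@dataclass
class EvaluationRecord:
    """One extracted evaluation, in a shared format (hypothetical fields)."""
    study_id: str
    country: str        # context: where the study took place
    phase: str          # context: e.g. "primary" or "secondary"
    design: str         # methodology: e.g. "RCT"
    effect_size: float  # impact: e.g. a standardised mean difference

# A new review extracts into the same shape, so its studies can be
# appended to the shared database rather than re-extracted later.
record = EvaluationRecord("S0001", "Cameroon", "primary", "RCT", 0.12)
row = asdict(record)  # plain dict, ready for a database table or CSV export
```

Because every review writes records of the same shape, a later rapid review can query the pooled data directly instead of repeating steps one to four.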

What we really want is for this infrastructure to become a global good for education: sharing data with any researcher or organisation interested in synthesis, and growing the database through collaboration.

It is this desire to make access to evidence a global good that makes us so excited to announce our collaborative project with Effective Basic Services Africa (eBASE). With funding from the CEDIL programme (Centre of Excellence for Development Impact and Learning), we are able to add impact evaluations of programmes from low- and middle-income countries to the database.

We hope that this is the first of many collaborations on building better evidence infrastructure. Having sharable data on all education evaluations will open up many possibilities – from understanding variation in impact by age or subject, to beginning to empirically examine the transferability of education practices between different populations.
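Once evaluations share a common format, those subgroup questions become simple aggregations. A toy illustration with made-up numbers, assuming records like those sketched above:

```python
from collections import defaultdict
from statistics import mean

# Invented data for illustration only.
records = [
    {"phase": "primary", "effect_size": 0.15},
    {"phase": "primary", "effect_size": 0.05},
    {"phase": "secondary", "effect_size": 0.30},
]

# Group effect sizes by school phase, then average each group.
by_phase = defaultdict(list)
for r in records:
    by_phase[r["phase"]].append(r["effect_size"])

summary = {phase: mean(sizes) for phase, sizes in by_phase.items()}
```

A real synthesis would of course use proper meta-analytic weighting rather than a simple mean, but the infrastructure point stands: with the data already extracted, the question takes minutes rather than months.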

We want data from evaluations to be a global good. This is an open invitation to organisations interested in evidence synthesis to join us in building the infrastructure to make it happen.