EEF Marking Review: Responding to feedback

James Richardson and Robbie Coleman respond to feedback on the EEF's Marking Review

Since its publication last month, the EEF’s marking review, A Marked Improvement?, has attracted a substantial amount of web traffic, media coverage and discussion on social media. In addition, a number of bloggers, including David Didau and Ben White, have provided reflections and comments on how the report may (or may not) be useful in schools.

Given the prominence of the report and the topic (readers who wish to view this post in green font may click here), it feels right to respond to some of that feedback.

Three reflections:

1. The key finding

We would be very happy if people took the current lack of evidence on marking as the key finding of the report. David Didau, in his blog, expresses a hope that this should be the main takeaway for readers, particularly if burdensome marking policies are being considered or reviewed. As far as we are aware, the EEF’s report is the first to state this finding so clearly. So, far from being an “admission”, wide understanding of this point among both schools and researchers would be an excellent outcome.

2. What to do when evidence is weak

There is a difference of opinion about what to do after identifying that the evidence base is, overall, weak. One option would have been to stop completely and say no more. Another option, which we took, was to identify some areas of promise where future research could be done, and to provide some information for schools to consider when devising and developing marking policies in the meantime.

We think taking the former course would have been a mistake for two reasons. First, and perhaps most importantly, it would have wasted valuable time. Given the dearth of evidence on marking, there is collective agreement that more research is needed, and alongside the publication of the report the EEF announced that £2 million has been ring-fenced for exactly this purpose. Identifying those areas where there is promise – even if the promise is based on evidence that is relatively weak or that comes from related fields – will help ensure that we get the research we need as quickly as possible.

Second, it would have underestimated the ability of schools to deal with imperfect information. The overall weakness of the evidence base is referred to prominently throughout the report. It would be a step backwards to conclude that readers are unable to weigh both the security of the evidence and the possible promise, or impact on learning, of a particular policy or aspect of marking. We firmly believe that schools reviewing their marking policy would find it useful to consider the early findings about mistakes vs. errors, the evidence from related fields about target-setting, and the existing evidence about grading. Clearly, these findings are not strong enough to make decisive contributions, or to undermine particular policies that schools have carefully considered and believe work well. But it does not follow that time spent engaging with the existing evidence base is time wasted.

3. Thanks for reading

If researchers and schools are to have a productive relationship – and at the EEF we firmly believe that they can – then communication between them must be open and honest. It’s extremely helpful to get feedback – positive and negative – on our work, and we look forward to receiving applications from teachers to trial different marking approaches when the next funding round opens in June.

You can access our evidence review on written marking, A Marked Improvement?, here.