Improving Numeracy and Literacy in Key Stage 1

The Improving Numeracy and Literacy project aimed to improve the numeracy and literacy abilities of pupils in Year 2 through two separate programmes of teacher training and accompanying teaching materials and computer games. The Mathematics and Reasoning programme aimed to develop children’s understanding of the logical principles underlying mathematics, and the Literacy and Morphemes programme aimed to improve spelling and reading comprehension by teaching children about sentence structure and morphemes. Morphemes are components of words that are either stems, which can often appear as words on their own (such as ‘fair’), or affixes, which cannot be words on their own (such as ‘un-’ or ‘-ly’). The programmes were originally developed (with the support of the ESRC-TLRP Research Programme) by Professor Terezinha Nunes and Professor Peter Bryant at the Department of Education, University of Oxford.

Both interventions were designed to last for 10 to 12 weeks, with children receiving one hour of instruction per week as one of their normal literacy or numeracy lessons. Teachers in intervention schools attended a day of training aimed at introducing them to the programmes, explaining the concepts, and allowing them to explore the learning activities for themselves. This was followed by a visit from a member of the Oxford-based research team to support programme implementation.

Fifty-five schools were recruited by the University of Oxford team to participate in the evaluation: 17 were allocated to the numeracy group, 19 to the literacy group, and 19 to the control group.

Key Conclusions

The following conclusions summarise the project outcomes:

  1. This evaluation provided evidence that the Mathematics and Reasoning programme had a positive impact on pupils’ numeracy ability equating to three additional months’ progress.

  2. There was no evidence to suggest that the Literacy and Morphemes programme had an impact on pupils’ literacy ability overall.

  3. There was an association between greater use of the accompanying computer games and greater impact in the numeracy intervention, suggesting the computer games were important to successful implementation.

  4. All teachers were able to implement the programmes, but most agreed there was too much content to deliver in one hour per week and so made various adaptations to their delivery of the programme. In future trials of the programmes, teachers should be permitted to use and integrate the materials in their own way, as they would in a normal teaching situation.

  5. A future trial could evaluate the programmes at scale in more than one location. When drawing up plans for bringing the programmes to scale, the Oxford team should consider whether training and ongoing technical support could be delivered remotely, rather than in person.

What is the impact?

The evaluation of the Mathematics and Reasoning programme provided evidence that it had a positive impact on pupils’ numeracy ability: pupils who received the programme made, on average, the equivalent of three months’ additional progress over the course of a year. Although pupils who received the Literacy and Morphemes programme made slightly less progress than the control group, this difference was too small to confidently conclude that it was caused by the intervention. The evaluation therefore provided no evidence that Literacy and Morphemes had an impact on literacy ability. In both evaluations, pupils eligible for free school meals (FSM) made slightly more progress if they participated in the programmes, but we are not able to conclude that these observed effects were caused by the programmes themselves rather than occurring by chance.

The teacher questionnaire indicated that teachers found the interventions straightforward to implement. All teachers were able to use the teaching units to deliver the numeracy or literacy content in weekly sessions.

This evaluation builds on previous work conducted by the Oxford University team that suggested that the programmes had evidence of promise. In these studies, the interventions were delivered either one to one or in small groups by researchers familiar with the intervention materials and the underlying theory. In addition, the measures used were closely aligned to the concepts being taught. The purpose of this trial was to evaluate the efficacy of teacher delivery in a whole-class situation and examine the impact on pupils’ overall performance in numeracy and literacy.

Group | Effect size | Estimated months' progress
Mathematics and Reasoning | 0.20 | +3 months
Literacy and Morphemes | -0.05 | -1 month
Mathematics and Reasoning (FSM pupils only) | 0.14 | +2 months
Literacy and Morphemes (FSM pupils only) | 0.10 | +2 months

How secure is the finding?

The findings from this trial have high security. The evaluation was well designed and successfully implemented. The programmes were evaluated using a randomised controlled trial that compared the progress of pupils who received the programmes to a ‘business as usual’ control group. This evaluation was an efficacy trial. Efficacy trials aim to test whether an intervention can work under ideal conditions, with intensive support from the intervention’s developer.

Schools were randomised into the three groups and informed of their allocation after the baseline testing was completed. No schools dropped out of the study, but around 12% of pupils were excluded from the analysis because of missing test data. The proportion of missing pupils was similar in the intervention and control groups, but bias may have been introduced if the intervention and control pupils dropped out for different reasons. Analysis using predicted values for the missing pupil data suggested that the missing data was unlikely to have affected the result.

The teaching resources were restricted to teachers in the intervention schools, so there was no risk of control group schools implementing elements of the intervention and ‘contaminating’ the trial. Testing to measure the outcomes was administered independently by NFER test administrators who were not told which group (literacy intervention, numeracy intervention, or control) the school had been allocated to, and tests were marked externally. The findings of the process evaluation are also secure: teacher questionnaires were distributed and collected by test administrators during the school visits and, as a result, the response rate was high, at 82%.

How much does it cost?

Over the single year of this evaluation, each intervention cost £21 per pupil (£520 per class). This figure includes the cost of training and the school visit, as well as costs associated with implementing the programme. Teachers required one day of supply cover to attend the training for each programme during term time; whether this incurred an additional cost depended on how each school arranged the cover.

The estimated cost of implementing the programme over three years is £10 per pupil per year (£257 per class per year). Access to computers and/or tablets is a factor in successful implementation and may require additional investment for some schools in the future.
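
As an illustrative consistency check (assuming an average class of roughly 25 pupils, a figure not stated in this report), the per-pupil and per-class figures correspond approximately as follows:

  £21 per pupil × 25 pupils ≈ £525 per class in the evaluation year (reported as £520 per class)
  £10 per pupil per year × 25 pupils ≈ £250 per class per year over three years (reported as £257 per class per year)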