Mathematics Mastery: Secondary
The Mathematics Mastery programme is a whole-school approach to teaching mathematics that aims to raise attainment for all pupils and close the attainment gap between pupils from low-income families and their peers. The programme aims to deepen pupils’ conceptual understanding of key mathematical concepts. Compared to traditional curricula, fewer topics are covered in more depth, and greater emphasis is placed on problem solving and on encouraging mathematical thinking.
This evaluation assessed the impact of Mathematics Mastery on pupils in Year 7, after the programme had been implemented in schools for one year. It was intended that schools would also begin to use the programme in Year 8 in the second year of implementation, and continue until the approach was in place across the school. In total, 44 schools from London and the South East participated in the trial, with a sample of 5,938 pupils. Participating schools received training and resources to support the adoption of the programme, which was delivered by the education charity Ark.
The project was one of two evaluations of Mathematics Mastery funded by the Education Endowment Foundation (EEF). A second project assessed the impact of Mathematics Mastery on pupils in Year 1.
Testing an approach to teaching mathematics developed in Singapore.
The Institute of Education
Developing effective learners
Staff deployment & development
The following conclusions summarise the project’s outcomes:
On average, Year 7 pupils in schools adopting Mathematics Mastery made slightly more progress than pupils in schools that did not. However, the effect detected was not statistically significant, meaning that it is not possible to rule out chance as an explanation.
There is no strong evidence that the approach had a greater impact on lower-attaining pupils than on higher-attaining pupils.
Combining the findings from this study and a second randomised controlled trial of Mathematics Mastery involving Year 1 pupils may strengthen the overall evidence for the approach.
Given the low per-pupil cost, Mathematics Mastery may represent a cost-effective change for schools to consider. However, teachers would need to resolve tensions related to differentiation to provide support for all groups of children.
It would be worthwhile to track the medium- and long-term impact of the approach, to assess whether there is a cumulative effect to the approach and whether it has an impact on performance in high-stakes tests.
How secure is the finding?
Overall, the findings from this evaluation are judged to be of moderate to high security. The evaluation was set up as an effectiveness trial, meaning that it aimed to test the programme under realistic conditions in a large number of schools. The evaluation used a randomised controlled trial design, with schools randomly allocated to adopt the programme or continue with ‘business as usual’. Randomisation reduced the likelihood that there were unobservable differences between schools in each group, and increased the security of the findings.
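The allocation described above can be illustrated in miniature. This is a hedged sketch of a two-arm cluster randomisation, not the evaluators’ actual procedure: the school identifiers, the even 22/22 split, and the random seed are all placeholders for illustration.

```python
import random

# Illustrative two-arm cluster randomisation: schools (the unit of
# randomisation here) are shuffled, then split into intervention and
# control arms. School names and the even split are placeholders,
# not trial data.
random.seed(42)  # a fixed seed makes the allocation reproducible
schools = [f"school_{i:02d}" for i in range(44)]
random.shuffle(schools)
intervention, control = schools[:22], schools[22:]
print(len(intervention), len(control))
```

Randomising at the school (cluster) level, rather than the pupil level, matches how a whole-school programme is actually delivered, but it means the analysis must account for pupils within a school being more similar to each other than to pupils elsewhere.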
To help assess whether the improvement should be attributed to the programme, it is possible to combine the findings from this trial with other evaluations of Mathematics Mastery. This approach, known as a ‘meta-analysis’, can lead to a more accurate estimate of an intervention’s effect. However, it is also important to note the limitations of meta-analysis, and the care needed in interpreting findings based on studies that may vary in important ways. Combining the findings from this study and a second randomised controlled trial of Mathematics Mastery involving Year 1 pupils shows a statistically significant average impact of one additional month’s progress. This combined finding strengthens the evidence for the approach overall, and is discussed in further depth in a summary report on the EEF’s website.
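A fixed-effect (inverse-variance) pooling of the two trials can be sketched as follows. The Year 7 figures are taken from the table in this report; the Year 1 effect size and standard error are illustrative placeholders, not results quoted here, so the printed pooled estimate should be read only as a demonstration of the method.

```python
# Fixed-effect (inverse-variance) meta-analysis sketch.
# Year 7 numbers come from the table in this report; the Year 1
# numbers are illustrative placeholders, NOT results from the report.

def se_from_ci(lo, hi, z=1.96):
    """Recover a standard error from a 95% confidence interval."""
    return (hi - lo) / (2 * z)

def fixed_effect(studies):
    """studies: list of (effect_size, standard_error) tuples.
    Each study is weighted by the inverse of its variance."""
    weights = [1 / se ** 2 for _, se in studies]
    pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    return pooled, pooled_se

year7 = (0.06, se_from_ci(-0.04, 0.15))  # from the table in this report
year1 = (0.10, 0.05)                     # illustrative placeholder only
pooled, se = fixed_effect([year7, year1])
print(f"pooled effect = {pooled:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```

The intuition is that each trial’s estimate is imprecise on its own, but weighting by precision and combining narrows the confidence interval, which is how a pooled result can reach statistical significance when the individual trials do not.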
Of the schools and pupils that initially enrolled in the trial, 88% of schools and 73% of pupils were successfully followed through to completion. Participating schools volunteered to take part, so it is not possible to say whether similar effects would be seen in all schools. Compared with the country as a whole, participating schools had more lower-achieving pupils, more pupils from low-income families, and more pupils from ethnic minority backgrounds. The extent to which the low-stakes test results used in this evaluation are predictive of performance in high-stakes exams is difficult to assess at this stage.
| Group | No. of pupils (schools) | Effect size (95% confidence interval) | Estimated months’ progress | Evidence strength | Cost |
|---|---|---|---|---|---|
| All pupils vs. comparison | 5,938 (44 schools) | +0.06 (−0.04 to +0.15) | +1 month | | |
| FSM pupils vs. comparison | 1,610 (44 schools) | +0.07 (−0.04 to +0.17) | +1 month | N/A | |
What is the impact?
On average, pupils in schools adopting Mathematics Mastery made more progress than similar pupils in schools that did not adopt the programme. The small positive effect is equivalent to approximately one month’s additional progress. However, the effect was not statistically significant, meaning that it is not possible to rule out chance as an explanation. A similar average impact was found for pupils eligible for free school meals.
There is no strong evidence that the approach had a greater impact on lower-attaining pupils than on higher-attaining pupils. Pupils made more progress on the topics that the programme emphasised. Although Mathematics Mastery does not cover calculator use in its Year 7 syllabus, no negative effects were observed on this aspect of pupils’ maths skills. Possible explanations for the small average effects include the relatively limited exposure pupils had had to the programme and the fact that this was the first year the programme had been introduced. There were wide variations in how schools responded to the intervention. Nevertheless, there was evidence of a shift in most schools away from the teaching of procedures towards a problem-solving approach, involving increased use of discussion, objects, and diagrams.
Some tensions arose where teachers felt unsure about the impact of the intervention on examination results, in particular in relation to the seemingly reduced content coverage in each year. In addition, some teachers were unsure how to differentiate for different ability groups when tasks were open and sometimes lacked explicit learning objectives. In a follow-up study, GCSE results will be used to evaluate the long-term impact of the programme.
How much does it cost?
The cost of the approach is estimated at approximately £7,460 for a secondary school in the first year, including teacher training. This works out at around £50 per pupil in the first year, with the cost per pupil likely to fall in subsequent years.
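The per-pupil figure follows from dividing the first-year cost by the size of the cohort receiving the programme. The cohort size of roughly 150 pupils used below is an assumption inferred from the report’s two figures, not a number stated in the report.

```python
# Rough cost-per-pupil check. The ~150-pupil Year 7 cohort is an
# assumption inferred from the report's figures, not stated in it.
first_year_cost = 7460        # £, including teacher training
cohort_size = 150             # assumed Year 7 cohort
per_pupil = first_year_cost / cohort_size
print(f"approx. £{per_pupil:.0f} per pupil in year one")
```

Because the training cost is front-loaded, spreading the same fixed outlay over additional year groups in later years is what drives the per-pupil cost down over time.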