Hampshire Hundreds

The Hampshire Hundreds project was a local authority-led intervention that brought together lead teachers from Hampshire primary schools, providing them with evidence and support for effective teaching strategies aimed at narrowing the attainment gap between disadvantaged pupils and their peers.

The intervention supported teachers in better understanding the learning needs of their pupils and in improving the quality of their teaching, in particular their questioning and feedback.

Key Conclusions

The following conclusions summarise the project outcomes:

  1. Hampshire Hundreds showed no significant impact on raising attainment for disadvantaged pupils during the evaluation period.

  2. Schools were encouraged to respond to the training and support in a way that was most suitable for the context of the school.

  3. This project illustrates that it is difficult to convert research evidence into effective action within schools. It also illustrates the importance of careful piloting of an intervention before attempting an impact evaluation.

What is the impact?

The intervention showed an effect size of 0.03 for disadvantaged pupils; a similar effect size was found for children eligible for free school meals. This effect size is very small, but the confidence intervals are very wide, so we are unable to say whether or not the intervention had an impact on pupils' progress. Additional analysis using the Key Stage 2 assessments in Year 6 also showed an indeterminate effect. It is therefore not clear that the Hampshire Hundreds approach is an effective way of improving attainment outcomes for children.

| Group | No. of pupils | Effect size (95% confidence interval) | Estimated months' progress | Is this finding statistically significant? | Evidence strength | Cost |
|---|---|---|---|---|---|---|
| Disadvantaged pupils | 924 | 0.03 [-0.09, 0.14] | +1 | No | | |
| Other pupils | 1124 | -0.02 [-0.10, 0.08] | -1 | No | | |
| Free school meal pupils | 436 | 0.03 [-0.12, 0.17] | +1 | No | | N/A |
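The "statistically significant?" column follows directly from the confidence intervals: an effect is significant at the 5% level only if its 95% interval excludes zero. A minimal sketch of that check, using the figures from the table (the helper name is our own, not part of the evaluation):

```python
# Check whether a 95% confidence interval excludes zero, which is how the
# "statistically significant?" column in the table above is determined.
# Effect sizes and intervals are taken from the table; names are illustrative.

def is_significant(ci_low: float, ci_high: float) -> bool:
    """True only if the 95% CI excludes zero (significant at the 5% level)."""
    return not (ci_low <= 0.0 <= ci_high)

results = {
    "Disadvantaged pupils":    (0.03, -0.09, 0.14),
    "Other pupils":            (-0.02, -0.10, 0.08),
    "Free school meal pupils": (0.03, -0.12, 0.17),
}

for group, (es, lo, hi) in results.items():
    print(f"{group}: effect size {es:+.2f}, significant: {is_significant(lo, hi)}")
```

All three intervals straddle zero, which is why none of the findings is reported as statistically significant despite the small positive point estimates.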

How secure is the finding?

The evaluation was set up as a randomised controlled trial to test the impact of the Hampshire Hundreds intervention against a 'business as usual' control group, with the local authority leading the training and overseeing the provision of the intervention. As the intervention was run by the local authority, it is classified as an efficacy trial. Efficacy trials seek to test interventions under the best possible conditions to see if they hold promise. They do not indicate the extent to which the intervention will be effective in all schools, since the participating schools are selected from one area and the programme is delivered by the developers.

Analysis was completed on an ‘intention to treat’ basis where schools were compared in the groups to which they were originally randomly assigned.

The primary outcome measure was disadvantaged pupils’ progress in reading and maths (combined), as measured by InCAS (Interactive Computerised Assessment System) developed by the Centre for Evaluation and Monitoring (CEM) at Durham University.

A total of 37 schools were recruited into the trial. However, high attrition between the first and second rounds of testing meant that only 14 treatment schools and 10 control schools completed the post-intervention testing, which significantly reduced the statistical power of the trial. Schools dropped out of both the intervention and control arms (5 and 7 schools respectively). However, this does not appear to have biased the experiment: there were no apparent systematic differences between the schools that dropped out and those that remained in the trial (see 'Pupil Characteristics' section on page 17).

The evaluation protocol estimated that it would only be possible to detect large effect sizes in a trial of this size. The extent of attrition made finding a statistically significant effect less likely. However, when we use administrative data to look at outcomes (where attrition is not an issue because we look at Key Stage 2 results), we also find results that are consistent with this analysis. This analysis is additional to that outlined in the evaluation protocol and the results are described below.

The intervention was conceived as an action research project in which participating schools would be free to shape their own delivery. The process evaluation suggested a mismatch between the action research approach and the evaluation design (a randomised controlled trial); this is discussed further in the concluding section of this report.

The project's evaluation protocol is available to view online.

How much does it cost?

If not supported by the EEF, schools would have needed to pay for LA support time at a cost of £600 per day. An estimated 3.5 days of support per school gives a consultancy cost of £2,100 per school (see 'Cost' section on page 26 for further detail).

The schools did not have to buy any additional resources, but they did have to support the project with additional staff time in school: arranging testing, considering lesson observation evidence, co-planning and collaborating on classroom application of resources, and responding to data requests and other communications from the project management team. The amount of time each school allocated varied; a rough estimate would be a minimum of 5 days of additional internal time per school. Using the LA day rate for teacher supply of £190 per day, this amounts to a minimum of £950 per school.
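The per-school cost estimate above can be sketched as a short calculation. Day rates and day counts come from the text; the variable names are our own:

```python
# Per-school cost estimate for the Hampshire Hundreds intervention,
# using the figures stated in the report text.

LA_DAY_RATE = 600        # £ per day of local authority consultancy
CONSULTANCY_DAYS = 3.5   # estimated LA support days per school

SUPPLY_DAY_RATE = 190    # £ per day (LA day rate for teacher supply cover)
INTERNAL_DAYS = 5        # minimum additional internal staff days per school

consultancy_cost = LA_DAY_RATE * CONSULTANCY_DAYS  # £2,100 per school
internal_cost = SUPPLY_DAY_RATE * INTERNAL_DAYS    # £950 minimum per school

print(f"Consultancy: £{consultancy_cost:,.0f}")
print(f"Internal staff time (minimum): £{internal_cost:,.0f}")
print(f"Total per school (minimum): £{consultancy_cost + internal_cost:,.0f}")
```

Note that the internal-time figure is a minimum, so the combined total of £3,050 per school should be read as a lower bound rather than a full cost.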