Online Reading Support

Abracadabra (ABRA) is a 20-week online literacy programme composed of phonics, fluency, and comprehension activities based around a series of age-appropriate texts. Four 15-minute sessions per week are delivered by a teaching assistant (TA) to groups of three to five pupils. This report summarises the findings of a randomised controlled trial assessing the impact of ABRA on literacy outcomes for Year 1 pupils. The trial also assessed the impact of an offline, paper-and-pencil version of the same intervention (referred to here as ‘the non-ICT intervention’). There were 51 participating schools and 2,241 pupils at randomisation; 48 schools and 1,884 pupils were included in the final analysis (84% of the pupils at randomisation).

The trial took place between October 2014 and May 2015. Fifty-one schools were randomly assigned either to receive a version of the intervention or to act as a ‘control’ school delivering business as usual. In the schools receiving the intervention, pupils were randomised to one of three options: (1) ABRA, (2) the non-ICT intervention, or (3) standard literacy provision. The process evaluation involved observing sessions to understand which elements contributed to successful implementation, the perceptions and experiences of TAs and project leads, levels of pupil engagement, and the mechanisms behind the estimated impacts. This was an efficacy study because the developer was involved in delivering the programme. The study was funded by the Education Endowment Foundation and Nominet Trust as part of a funding round focusing on the use of digital technology to improve outcomes for disadvantaged children.

Key Conclusions

The following conclusions summarise the project outcomes:

  1. Children who received ABRA or its offline alternative made two and three months’ additional progress in literacy respectively, compared to children who received standard provision. These positive results would be unlikely to occur by chance.

  2. For both ABRA and the offline alternative, the impact was larger for children eligible for free school meals and for children with below-average pre-test scores than for pupils overall.

  3. Successful implementation was attributed to a well-designed and well-delivered training programme which emphasised fidelity and consistency, reinforced by ongoing support from the project team.

  4. The process evaluation found that both the ICT and non-ICT interventions may be best delivered in groups of similar rather than mixed ability. The process evaluation also suggested minor changes to the intervention to make it more culturally relevant to British pupils, and to remove some repetition in the non-ICT programme.

  5. Future research will examine whether ABRA or the non-ICT intervention can be successfully delivered at scale, and will look at longer-term impacts through assessing Key Stage 1 data from this trial.

What is the impact?

Both the ICT and non-ICT treatments were found to have positive effects on literacy that were unlikely to have occurred by chance, although the non-ICT effect is considerably larger. The impact was higher for children eligible for free school meals (FSM) for both ABRA and the non-ICT intervention, with both groups making the equivalent of five months’ additional progress. Pupils with below-median pre-test scores seemed to benefit from ABRA, whereas the non-ICT intervention seemed to benefit both below- and above-median pupils. Pupils who received normal literacy provision in the schools where the interventions took place did better than pupils in schools that only delivered normal literacy provision, which is consistent with the existence of spillover (or peer) effects. This preliminary evidence will be investigated further in future research, once teaching assistant survey and log data become available.

The process evaluation indicated that the implementation of both interventions was successful and benefitted from a well-designed and well-delivered training programme which emphasised fidelity and consistency, facilitated by ongoing support from the project team.

These positive findings for the ICT programme are in line with previous studies of ABRA, which found that it leads to improvements in literacy. This evaluation adds to that body of evidence by (1) looking at a larger number of pupils than previous studies, (2) comparing ABRA with a non-ICT treatment, and (3) looking at the results of pupils who did not receive ABRA but attended school with pupils who did, allowing the study to examine whether there are benefits for pupils who do not receive the intervention directly. The comparison between ABRA and the non-ICT intervention, which used the same materials in a non-digital format, is particularly useful: the positive non-ICT results suggest that it is the literacy programme itself making the difference rather than the digital delivery format. Several studies using strong research designs at a relatively large scale have found no evidence that ICT programmes improve pupil outcomes; within this broader literature, the positive findings of this study are relatively unusual.

Intervention group | Number of schools | Effect size (95% confidence interval) | Estimated months’ progress | Security rating | Cost
ICT | 48 | 0.138 (0.004, 0.273) | +2 | — | —
ICT (FSM pupils) | 48 | 0.368 (0.089, 0.646) | +5 | N/A | —
Non-ICT | 48 | 0.231 (0.102, 0.360) | +3 | — | —
Non-ICT (FSM pupils) | 48 | 0.396 (0.195, 0.596) | +5 | N/A | —
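The months-of-progress figures in the table are derived from the effect sizes using the EEF's standard banded conversion. The official conversion table is not reproduced in this summary, so the sketch below uses illustrative band boundaries (an assumption made here purely for illustration, not the EEF's published cut-offs) to show how effect sizes of this magnitude map onto whole months of additional progress.

```python
# Illustrative sketch: mapping an effect size onto an estimated number of
# months' additional progress using banded cut-offs. The boundaries below
# are assumptions for illustration only, not the official EEF conversion table.

ASSUMED_BANDS = [
    (0.10, 2),  # effect sizes from roughly 0.10 upwards -> about 2 months
    (0.19, 3),
    (0.27, 4),
    (0.36, 5),
]

def months_progress(effect_size: float) -> int:
    """Return the assumed months-of-progress band for a given effect size."""
    months = 0
    for lower_bound, band_months in ASSUMED_BANDS:
        if effect_size >= lower_bound:
            months = band_months
    return months

# Effect sizes reported in the table above
for label, effect in [("ICT", 0.138), ("ICT FSM", 0.368),
                      ("Non-ICT", 0.231), ("Non-ICT FSM", 0.396)]:
    print(f"{label}: {effect} -> +{months_progress(effect)} months")
```

Under these assumed bands, the four reported effect sizes reproduce the +2, +5, +3, and +5 months shown in the table.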

How secure is the finding?

These findings have moderate to high security. The trial was a three-armed randomised controlled trial. Schools were randomised between receiving the interventions (ABRA and the non-ICT intervention) and continuing normal literacy provision, and within the intervention schools pupils were randomised between ABRA, the non-ICT intervention, and standard literacy provision. The trial was large and the pupils who received the interventions were similar to the pupils in the comparison group; however, 16% of pupils were not included in the analysis because they did not complete all the tests at the end of the trial.

How much does it cost?

The average cost per pupil per year over three years is £8.52 for the ICT intervention and £8.49 for the non-ICT intervention. This cost includes training the teaching assistants, cover during training, and travel costs. All of the costs are frontloaded into the first year of the programmes (which cost £25.56 and £25.47 per pupil respectively); the programmes are free to deliver in the following years.
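As a quick check of the amortised figures, spreading the frontloaded first-year cost over three years reproduces the per-pupil averages quoted above. This is a minimal sketch: the figures come from this section and the variable names are illustrative only.

```python
# Minimal sketch: amortising the frontloaded first-year cost over three years.
# Figures are taken from the cost section above; names are illustrative only.

first_year_cost = {"ICT": 25.56, "non-ICT": 25.47}  # £ per pupil, year 1 only
years = 3  # the programmes are free to deliver in years 2 and 3

for programme, cost in first_year_cost.items():
    average_per_year = cost / years
    print(f"{programme}: £{average_per_year:.2f} per pupil per year")
# Prints £8.52 for ICT and £8.49 for non-ICT, matching the averages above.
```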

EEF commentary 

The EEF tested ABRA online reading support, a small-group literacy intervention that is administered by specially trained teaching assistants and takes a balanced approach to teaching literacy, involving both phonics and reading comprehension. We funded the project because it takes an evidence-based approach to literacy, and because the ABRA software is free, meaning that ongoing costs to schools would be low after the initial training for teaching assistants.

Our evaluation tested the online intervention alongside a paper-based alternative using the same material. Positive effects were found for both, equivalent to two to three months of additional progress, with a larger impact for pupils eligible for free school meals. The findings are consistent with the evidence from the Teaching and Learning Toolkit, which indicates that technology is most effective when used to facilitate new approaches to teaching and learning rather than as an end in itself, and with our Key Stage 1 Literacy Guidance, which recommends a balanced approach to teaching reading.

The EEF is likely to fund a further trial of ABRA to test its impact when delivered at scale. Many schools already purchase commercial reading programmes and may wish to consider the ABRA training and free software as an alternative with promising evidence.