Evidence for the Frontline

Evidence for the Frontline (E4F) is an online brokerage service designed to provide teachers and school leaders with timely access to relevant evidence on supporting young people's learning. It was developed by Sandringham School and the Institute for Effective Education (IEE) at the University of York, with support from the Coalition for Evidence-Based Education (CEBE).

Teachers and school leaders develop their own questions about improving teaching and learning and post them on the online platform. An online broker then matches each question to an academic researcher for a response, or signposts the teacher to relevant published evidence and to answers already provided by the service. In some cases the broker helps the user to frame their question effectively before passing it on. Examples of the research questions included: ‘Are there any studies that set out to explicitly explore the influence that inter-school collaboration has on student outcomes?’; and ‘Is there any evidence to suggest that students perform better if their Science teacher is teaching within their Science specialism up to Key Stage 4?’

After a development phase, involving representatives from 12 schools, the service was delivered as a pilot in 32 schools (14 primary, 16 secondary and 2 special schools) between September 2015 and July 2016. The aim of the evaluation was to establish: whether the intervention is feasible to deliver; whether it has promise in terms of changing teacher attitudes and behaviour; and whether the service is suitable for evaluation in a randomised controlled trial.

Key Conclusions

The following conclusions summarise the project outcomes.

  1. Demand was at the upper end of expectations: 192 users from the 32 schools (around 9% of teachers) posted a question over the year of the pilot. Sixty percent of teachers who responded to the survey indicated that they used the service to ask a question or to read the responses.

  2. The majority of users who responded to the survey had a positive experience of using the E4F service. They were satisfied with the quality of the answers provided and found the E4F website easy to use. The topics users most commonly asked evidence-based questions about included: pupil engagement and behaviour, developing independent thinking, differentiation, literacy, and feedback and monitoring pupil progress.

  3. Users considered that there were benefits to using the E4F service, particularly in terms of providing opportunities for research discussion; increasing their interest and enthusiasm for research evidence; improving their schools’ use of research evidence; and (although to a slightly lesser extent) helping to improve their practice in the classroom and pupils' learning.

  4. The pilot identified a number of potential improvements to the service, including faster responses for teachers, better promotion of the brokerage service, and facilitation of direct dialogue between the teachers and researchers.

  5. Although there are signs that E4F is starting to make a difference to research engagement, it is not suitable for a randomised controlled trial (RCT) measuring impact on pupil outcomes. The service is responsive to the questions teachers ask, which this pilot found to cover a wide range of subject areas, making pupil outcomes a challenge to measure using administrative data.

What is the impact?

Senior leaders and teachers were most positive in reporting that using E4F helped to improve their schools’ approach to using research evidence and that it gave them opportunities to discuss research evidence with others. In addition, although to a slightly lesser extent, they felt it helped to improve their teaching. However, a large minority (up to a fifth) did not feel there was evidence of benefits to pupils’ learning. Interviews with teachers indicated that one of the reasons for this response was that, at the time of the questionnaire, it was too early to say whether changes in classroom practice had translated into improvements in pupil outcomes.

Users considered the E4F service had made a positive difference particularly to their interest in and enthusiasm for using research evidence; their awareness of new techniques and approaches and how to apply them in their teaching; and their understanding of new ideas, knowledge and information. In addition, although to a slightly lesser extent, they felt using E4F had made a difference to: their practice in the classroom; their pupils’ learning; and their schools’ policies and plans for using research evidence. It is worth noting that the teachers who responded to the survey may not be representative of all the teachers in the 32 pilot schools. In terms of impact on teacher attitudes and behaviour, two standard ‘Research Use’ factor measures analysed in the baseline and follow-up surveys produced mixed results, with low reliability. These were incorporated into the study to assess whether they would be suitable measures for any future evaluation of E4F; they would require further development before use in a future trial.

The majority of users and academics who took part in the evaluation were positive about their experience of E4F. Users were satisfied with the quality of answers provided. The majority considered that the E4F website was easy to use and that using the service was an effective use of their time. For future implementation, providers need to focus on increasing the speed of providing answers, enhancing awareness of the brokerage role, and encouraging greater dialogue between teachers and researchers.

Although there are signs that E4F is starting to make a difference to research engagement, it would not be suitable for a randomised controlled trial (RCT) measuring its impact on pupil outcomes, even when it is more developed. There are two main reasons for this. First, it is an intervention which needs to be flexible and responsive in the way that it operates. This makes it difficult to design an effective RCT. Second, it would be difficult to define a measure of pupil-level impact to capture the effect of the intervention across primary and secondary schools and a wide range of subjects.

Question: Is there evidence to support the theory of change?
Finding: Yes – in terms of formative findings.
Comment: There was evidence which identified perceived benefits and positive outcomes from using E4F, but it should be noted that a large minority (around a fifth) of respondents did not report evidence of improvements in pupil learning. Interviews indicated that this was because it was too early to say whether pupil outcomes were improving.

Question: Was the approach feasible?
Finding: Yes as a pilot; mixed going forwards.
Comment: Although the majority of users and academics were positive about their experience of using the service, further development is needed to increase the speed of providing answers, enhance awareness of the brokerage role, and encourage greater dialogue between teachers and researchers.

Question: Is the approach ready to be evaluated in a trial?
Finding: No.
Comment: E4F is not suitable for a trial because it does not offer a defined outcome measure for pupils. It may be possible to use the Research Use factor measures as secondary outcomes, but results for the two measures tested in this study were mixed and further work would be needed to develop Research Use measures before a future trial.

How secure is the finding?

This was a mixed-methods evaluation, designed to reflect the formative nature of a service at an early stage of development and implementation. The development phase involved the collaborative development of a Theory of Change and a monitoring information (MI) tool, interviews with developers and discussions with developer schools, and questions in NFER’s Teacher Voice Panel survey to establish whether there was wider demand for the service. The pilot phase involved baseline and follow-up surveys with staff in the 32 pilot schools, in-depth interviews in seven schools, interviews with research experts, and collection of MI and cost data. The participating schools varied in phase, geographical location, and level of disadvantage as indicated by Free School Meals eligibility. However, they were not a random sample: all of the schools involved had volunteered and were enthusiastic about taking part in the pilot.

How much does it cost?

Costs during the pilot included: staff salaries for the ongoing leadership and operation of the service; website costs; and the costs of hosting and attending two workshops. The total cost of the pilot phase was around £95,000.

EEF commentary

Evidence for the Frontline is an online platform which connects teachers with relevant research. Teachers post questions and expert researchers publish responses. A research broker helps schools to frame questions and interpret responses.

The EEF wants research to have a positive impact in the classroom. We funded this project for two reasons. First, to support a service which responds directly to the evidence needs of teachers, and assess whether it has potential to promote evidence-based practice. And second, to understand whether it would be possible to test the impact of this service on pupils’ outcomes.

The pilot of Evidence for the Frontline suggests the approach has potential in terms of the first of these reasons. Although the evaluators identified possible improvements, including faster response times, teachers said the service stimulated discussion with colleagues and increased their interest in and enthusiasm for using research evidence. Some also felt it directly improved their teaching practice.

The evaluators concluded that the flexible nature of the service made it unsuitable for a randomised controlled trial to measure impact on pupils. However, the EEF is interested in developing the service and monitoring its effectiveness as part of our evidence dissemination work.