
EEF Evaluation: Communicating and Engaging with Research

by Durrington Research School

The Trials

Recently, the EEF published its findings from two 'Literacy Octopus' trials, so called after their multi-armed design. Both trials focused on supporting the teaching and learning of literacy at KS2, and evaluated the impact of communicating research to schools. The effects were measured through pupil outcomes as well as teachers' use of research evidence to change their practice. Over 13,000 schools took part in the randomised controlled trials, with 29% FSM students (in line with the national average for primary schools). The large trial size, and therefore representation from many types of primary school, contributed to the five padlock security rating of the trial – meaning that the findings are considered very robust.

In the first trial, 823 schools were sent hard copies of evidence-based resources in a range of formats, for example booklets, magazines, emails, webinars and online databases. The second trial involved 12,500 schools and tested whether providing resources along with 'light-touch' support on how to implement the research findings would have greater impact. The 'light-touch' support included, for example, invitations to attend twilight sessions looking at how to apply the resources in the classroom. Findings from both trials indicate that neither approach had any impact on student outcomes for literacy at KS2, nor led to an increase in teachers' use of research evidence in their practice.

Research: What is it good for?

Although the findings from this large-scale evaluation may seem discouraging (and even, at first glance, damaging) for those advocating the gains to be made from sharing research evidence with schools, they do in fact bring to the fore some rich areas for thought and further development that can strengthen the research model. In particular, for research schools and their network colleagues, the evaluation findings point to a clear need to investigate research mobilisation and the specific actions that are required to make this kind of intervention effective in schools.

As a starting point, the findings from this evaluation add the rigour of empirical evidence to the already fairly widespread recognition that simply presenting research evidence to schools is not enough to improve outcomes. Indeed, what this trial has made more apparent is that research findings have to be somehow transformed into practice, and it is this crucial transformation from information on a page to behaviour in the classroom that has, so far, been a somewhat fuzzy area. Furthermore, the findings from the second trial indicate that any successful transformative process is likely to be more complex than the 'light-touch' supportive approach that was tested. So where to next?

Professor Jonathan Sharples references work carried out last year at the EPPI Centre at UCL's Institute of Education in which a range of 'knowledge mobilisation' interventions were reviewed. Significantly, as well as investigating the mechanisms that 'underpin' the research-based interventions, this review also identifies three behavioural requirements for any kind of research-based intervention to have an impact in schools:

  1. Opportunities to engage with the evidence-based interventions.
  2. Motivation to engage with the evidence-based interventions.
  3. Acquisition of the skills and capabilities to apply the evidence-based interventions.

It seems, therefore, that any decision to engage with research evidence in schools should comprise strategies for attending to these behavioural needs as well as steps to communicate the findings and suggestions for practice. Consequently, the failure of the second 'light-touch' trial to have any impact may be because the behavioural needs are multiple and interconnected. Accordingly, a single-method approach, for example attending a one-off training day where the evidence is explained in isolation from schools' contexts, will not suffice for effective change.

Next Steps to Consider

Conventional means of communicating research evidence need to be just one initial strand in a comprehensive approach to evidence mobilisation in schools. As well as sharing the findings from research, this EEF review indicates that any evidence-based intervention also has to incorporate a level of planning that supports and encourages the three behavioural needs outlined above. The key questions below could be useful starting points for reflection on how to bring this about:

  • There is a growing body of evidence that suggests the three required behaviours are more likely to be effectively implemented and sustained if they emerge through in-school approaches, for example coaching, mentoring, collaboration and further enquiry. What structures and processes do you have in place that not only provide opportunities for teachers to engage with research evidence and apply it in the classroom, but also motivate teachers to do so?
  • Conditions in schools need to support evidence-based change. How does your CPD provision ensure that teachers have the skills and support to be able to apply the evidence to their domains and change their classroom practice? 
  • For evidence-informed interventions to be effective they need to be embedded and integrated in all school structures. Does research evidence underpin decision making at every level in your school so that all processes support evidence-informed improvement?

Fran Haynes
