
Research School Network: Untangling the Literacy Octopus – three lessons from the latest EEF evaluation



Professor Jonathan Sharples, Senior Researcher at the EEF, looks at the key messages from our latest evaluation report investigating different ways of disseminating research evidence to schools.

On 1st December 2017 we published an evaluation of one of our most ambitious projects, the 'Literacy Octopus'. It provides plenty of food for thought for anyone interested in improving the way research evidence informs practice, not just in education, but across a range of sectors.

The Literacy Octopus projects are a pair of large, multi-armed – hence the name 'octopus' – trials designed to evaluate different ways of engaging schools with a range of evidence-based resources and events. The common focus was supporting improved primary literacy teaching, with the aim of improving pupil outcomes.

The first trial tested whether simply sending schools evidence-based resources in a range of formats could have an impact on literacy outcomes – this included printed research summaries, practice guides, webinars and an online database. The second trial tested whether combining these resources with additional support to engage with them would have greater impact.

In total, over 13,000 schools were involved in the two Literacy Octopus trials. Some schools were just sent evidence-based resources, while others received the resources along with additional light-touch support, such as invitations to twilight seminars on using the resources in the classroom. By testing different ways of engaging schools with the same evidence, the intention was to compare 'passive' and 'active' forms of research dissemination.

In what are some of the largest randomised controlled trials (RCTs) ever conducted in education, the evaluators, the National Foundation for Educational Research, found that none of the approaches had an impact on pupil attainment, nor on teachers' likelihood of using research to inform their practice.

The findings of the 'dissemination' trial, where schools were simply sent evidence-based resources, are perhaps not surprising. A few studies have suggested that basic communications can have a modest impact on practitioners' behaviours, which made this worth investigating. Nevertheless, there has been a growing recognition over the last 20 years that simply 'packaging and posting' research is unlikely, by itself, to impact significantly on practitioners' decision-making and behaviours. In many ways, today's findings add further weight to this understanding, now in the form of much-needed empirical research.

What this shows, I think, is that our notion of 'research use' needs to extend beyond just communicating evidence – for example, publishing a report online – to looking at how it is effectively transformed and applied in practice. This message is particularly sobering, given that basic communication strategies still make up the majority of organisations' efforts to mobilise research evidence, despite those organisations being aware of the limitations. This applies to all sectors, not just education.

So what about the second Literacy Octopus trial, the 'engagement' one, which tested the impact of providing schools with some additional support to engage with the evidence-based resources, yet also failed to show an overall impact on teaching and learning?

A recent systematic review, published last year by my colleagues at the EPPI Centre at UCL's Institute of Education, sheds some useful light on what might be going on. The review looked at six mechanisms that underpin a range of 'knowledge mobilisation' interventions and how they impact on decision-making (e.g. creating access to research, developing skills to use research). Importantly, in addition to reviewing different mechanisms for mobilising evidence, the authors also looked at the behavioural requirements that were necessary for those various approaches to have an impact. These included having:

  i) opportunities to engage with the interventions,
  ii) the motivation to do so, and
  iii) the skills and capabilities to understand and use the outputs.

Crucially, across all the different types of research-use intervention, they found that impacting on decision-making relied on also attending to these behavioural needs. For example, interventions that create access to research only appear to impact on decision-making if they are also combined with strategies that create opportunities and motivation for engaging with that evidence. Likewise, interventions that focus on building people's skills to use evidence, for example through training and professional development, depend on practitioners also having the opportunity and motivation to act on it. Furthermore, it is often the use of multiple strategies – as opposed to single strategies – that influences decision-making, particularly where these approaches are embedded in existing structures and processes (e.g. school improvement or policy systems).

In light of these insights, the interventions in what we termed the 'active' arms of the Literacy Octopus actually appear to be light touch. To what extent, for example, can attending a conference create the opportunities, motivation and skills needed to do something with the evidence being presented? What further support, capacity and conditions are needed for that evidence to gain traction in classroom and school improvement?

A range of evaluations funded by the Education Endowment Foundation over the last few years illustrates a similar trend: just exposing teachers to information about evidence-based practices is rarely sufficient in itself to improve teaching and learning, even if that information is underpinned by robust research. Projects such as Anglican Schools Effective Feedback, Ashford Research Champions, Teaching Effectiveness Enhancement Programme and Challenge the Gap all looked at high-impact strategies in our Teaching and Learning Toolkit, yet failed to show an impact on pupils' learning outcomes. If we look at projects that do show promise, they often provide careful scaffolds and support to help teachers apply conceptual understanding to practical classroom behaviours and specific subject domains. Schools that did change their practice using the evidence presented appeared to do so through structured in-school collaboration and enquiry.

Were we right to fund these projects? I think we were, for several reasons. Firstly, the activities in the Literacy Octopus trials are typical of what is currently going on in the UK around research use and knowledge mobilisation, and so are worth evaluating. That is not to say that these activities aren't useful – you can't use research, after all, if you don't know about it – but the evidence suggests they should be seen as a necessary, though not sufficient, condition for practical research use.

Secondly, as hinted at earlier, these evaluations add valuable evidence to our existing understanding of knowledge mobilisation. There is a striking paucity of robust, quantitative evaluations of interventions to support research use in schools, and these results extend and deepen the evidence base in this area.

More from the Doncaster Research School
