Research for schools

1 March 2017

Author: Steve Higgins

Professor Steve Higgins @profstig is the lead author of the Sutton Trust-EEF Teaching and Learning Toolkit. As a former primary school teacher, he has a particular interest in the use of research and evidence to improve learning in schools. In this blog post he explains how the Sutton Trust-EEF Teaching and Learning Toolkit offers a comparative summary of the evidence from education research, but needs thoughtful application if it is to be useful.

 

The education research literature contains a wealth of evidence about the impact of a wide range of approaches to improving outcomes for children and young people: more than anyone could read in a lifetime. This presents a challenge for finding and using research to improve practice in schools.

The Sutton Trust-EEF Teaching and Learning Toolkit aims to make this research more accessible and to provide summaries of the relative impact of different approaches to support decision-making in schools. It does not attempt to be comprehensive or to cover all areas of educational research. It does not include, for example, the wealth of evidence describing how children learn and develop their capabilities in different subject areas. Nor does it focus on the experiences of teachers and learners, or on the ways that teachers themselves learn and develop. These are important areas of research which can also help schools improve outcomes for learners.


The focus of the Toolkit is on summaries of the evidence about different approaches to improving outcomes which have stronger causal inference. This means concentrating on research designs which compare effects between groups of learners, some of whom do and some of whom do not receive an intervention or approach. We use meta-analyses to compare effects across these different areas and estimate the cost of different approaches as accurately as we can. This means we are interested not just in the effectiveness, but also in the efficiency of different approaches. This is a theme which I believe will become increasingly important over the next few years.

The Toolkit can only tell us what has been successful, on average, in other contexts, with other pupils, in other schools. This is “what’s worked”, rather than what will work somewhere else. It does, however, let us think about what makes a “good bet” from the research evidence in terms of what is likely to succeed (on average) somewhere new. This is where professional judgement becomes important. I argue that you will increase the chances of success if you choose an area of practice, or of pupils’ current performance, which you think needs improvement. The research you choose needs to be applicable and appropriate to a new context if it is to be likely to succeed. Think of this as using research to find solutions to current challenges. The risk of picking things at random from research is that you may not get much improvement, particularly if you are already doing relatively well. Of course, the research may also inform your decision-making about what to stop doing. Evidence about what has not worked, on average, may help you identify how to create space for something new.

The next challenge is that for each area in the Toolkit there is a spread of effects; the Toolkit summary just presents the average. Not everyone has succeeded with Feedback or Meta-cognition and Self-regulation. Even if the average indicates an approach is a good bet, you want to make sure you adopt or implement it in a way which increases the chances of success. Here we need to focus on what the learners will do differently. We don’t always understand the underlying causal mechanism of what brings about improvement in each of the Toolkit strands, but unless the learners work for longer, work harder, are more efficient in their learning activities or work more effectively, then we won’t see better learning.

This may also help you decide whether to choose, or stick with, something which, on average, other people have not succeeded with. If you can see that it is working for you (and have clear evidence for this in terms of outcomes for pupils), then it may be that you have an unusual or exceptional way of being successful where others typically struggle. The low average effects for Mentoring, for example, often surprise people. This suggests it is high risk, or perhaps a long odds bet, and that you may want to evaluate it carefully to be sure it is as effective as you think. We can easily be fooled by learners’ engagement in activities in schools and their interactions with their peers and teachers, when that engagement does not increase the time they spend learning, or does not lead them to work harder, more efficiently or more effectively in terms of successful learning.
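The point that an average can hide a wide spread of effects can be shown with a small sketch. The effect sizes below are invented for illustration only (they are not the Toolkit's actual data): two hypothetical strands have the same average effect, but very different spreads, so one is a much riskier bet than the other.

```python
import statistics

# Invented effect sizes (months of additional progress) from
# individual studies of two hypothetical Toolkit-style strands.
strand_a = [6, 7, 8, 9, 10]      # consistent positive effects
strand_b = [-4, 2, 8, 14, 20]    # same average, much wider spread

for name, effects in [("strand A", strand_a), ("strand B", strand_b)]:
    mean = statistics.mean(effects)       # what a headline summary reports
    spread = statistics.stdev(effects)    # what the headline hides
    print(f"{name}: average = {mean:+.1f} months, spread (SD) = {spread:.2f}")
```

Both strands average +8 months, but strand B includes studies with negative effects, which is why the spread matters as much as the headline figure when judging a "good bet".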

Where the Toolkit struggles is in making the research actionable in the classroom. This is partly because we focus on the broader patterns in the research which, for example, indicate the importance of the quality of teaching and learning interactions as opposed to structural or organisational changes which tend to be less effective.

[Figure: Spread of Toolkit effects, indicating the importance of teaching and learning interactions]

It is also because, in aggregating effects across different studies in a meta-analysis, it is often less clear what it is about a specific programme or approach that explains why it worked. Here it may be more useful to look at some of the other work of the EEF, in terms of its promising projects, its school themes and campaigns, or its guidance reports.

Overall, the Toolkit should be used to inform decision-making in schools. Start by identifying areas of professional practice or student performance which you think need development. Then identify areas of research which might help address these. Reflect on what it is that the learners will do differently as a result of the change which will improve their learning. Will they work for longer, harder, more efficiently or more effectively? Also decide what you will stop doing. How will you create the space to give the new approach a chance to succeed? What will it replace? It is not as though there is time to do more in school. We also need to use research to identify the things which are less effective.


 

Steve Higgins recently gave a keynote speech at the Kyra Research School launch conference. You can download the presentation slides from his talk here.

References/ further reading:

Sutton Trust-EEF Teaching and Learning Toolkit

Higgins, S. and Katsipataki, M. (2016) ‘Communicating comparative findings from meta-analysis in educational research: some examples and suggestions’, International Journal of Research & Method in Education, 39(3), pp. 237–254. http://dx.doi.org/10.1080/1743727X.2016.1166486

Higgins, S. (2016) ‘Meta-synthesis and comparative meta-analysis of education research findings: some risks and benefits’, Review of Education, 4(1), pp. 31–53. http://dx.doi.org/10.1002/rev3.3067


Comments:

  1. Rachel Lofthouse
    March 23, 2017 at 6:39 pm
Hi Steve, Thank you for this – I am currently sitting with my Newcastle Uni Masters group (of teachers and educators) who are undertaking practitioner enquiry this year. We have used your blog post as a source for discussion and it has been particularly helpful to stimulate thinking. They are almost all familiar with the toolkit already, and aware of its role and potential to aid decision making. We talked about this blog in the context of research-informed vs evidence-based teaching by also reading https://ioelondonblog.wordpress.com/2017/03/23/just-what-is-evidence-based-teaching-or-research-informed-teaching-or-inquiry-led-teaching/#comment-9301 which was published today. I anticipate that it helped them immediately as they moved on to conducting triad interviews with each other regarding their plans for enquiry. They haven’t stopped talking yet – so we are doing something right. Rachel
