Approach With Caution: Research Evidence Red Flags

If something appears too good to be true, it probably is.

by Shotton Hall Research School
If something appears too good to be true, it probably is. In their recently published guidance, Using research evidence, the EEF warns us to be wary of educational claims. Whilst it is good to remain open to new ideas, we must examine claims critically rather than accepting them at face value.

Using research evidence is vital for high quality teaching and school improvement. It helps drive decision making that enables us to select the best bets for our pupils. It is crucial, however, that we engage with credible evidence, and that we trace claims back to the research they are based on. To be intelligent consumers of research evidence, we must interrogate it further.

Beware of claims

There are a multitude of claims about what works and what doesn’t in education, and there are large variations in the quality and reliability of research evidence. How can we tell which claims are reliable? The EEF has identified some red flag warning signs that can support us in asking the right questions:

C: Conclusions

Do the conclusions appear one-sided? Do they focus only on the findings that support the researcher’s view? Is the evidence used to support conclusions clear?

L: Limitations

What might the study’s limitations be and how might these have affected the results? How many pupils were included in the research and were they representative of pupils more broadly?

A: Applicability

How applicable are the findings to my setting? Can the research be reproduced in ​‘real-world’ classroom contexts? Are we interested in improving the same aspect as the measure used in the research?

I: Independence

Is the research independent? Does the author have biases or a personal investment in the outcomes? If so, have they made enough effort to mitigate these? Is the evidence being used for commercial gain?

M: Methods

Were the processes of data collection and analysis explained clearly? Do we know what the researchers did and where their findings came from? Were their methods appropriate for the questions posed?

S: Sample population

Was the sample large enough and does it represent the target population? Was the demographic limited? Were the research and control groups of equal size? Did any settings or pupils drop out of the research?

Interrogating the evidence around managing cognitive load


The EEF’s Cognitive Science Approaches in the Classroom evidence review reveals shortcomings in the credibility of the research evidence on a range of interventions, such as managing cognitive load, spacing, interleaving, retrieval practice and working with schema.

Let’s take some of the evidence on managing cognitive load as an example. The EEF examined 22 school-based studies on using worked examples. There were limitations in evidence reliability as follows:

Most of the studies took place with secondary age students.
All studies focused only on maths and science.
Only 8 of the 22 studies involved worked examples delivered by the regular class teacher; the majority were delivered by researchers.
Similarly, the evidence review examined 16 school-based studies on the use of scaffolds to support cognitive load. Whilst the age range of these studies was wider (8–16), the subjects were limited to maths, science, reading and history, and only 7 of the studies were based on lessons delivered by the regular class teacher.

Some of the red flags are evident here, and we should therefore question how far these studies’ findings generalise. Because the studies were limited in terms of age range and subjects, they cannot be considered representative of the impact on pupils and subjects more broadly, which reduces the probability of the findings being applicable in different contexts. Whilst we might assume that the principles are transferable, we have no evidence of this. Also, studies carried out by trained researchers can never provide a true picture of the ​‘real-world’ classroom contexts in which regular teachers implement these strategies.

With that said, the presence of red flags does not render studies useless; far from it. I am not suggesting for a moment that teachers should stop using worked examples or scaffolds. The EEF evidence review notes that the evidence around these strategies is promising. We just need to be cognisant that even something with a lot of promise won’t necessarily work in our specific context. We should prioritise understanding the theory that sits behind each of these classroom strategies and deliberately monitor the impact of these approaches in our contexts.

Research evidence does not supplant professional judgement; it complements it. Educators should use the best available evidence and sense-check it against their own professional expertise. We must use robust evidence to consider what might work, then test the validity of that evidence in our own settings. We can build a rich evidence picture by considering studies from various sources and taking care not to ​‘cherry-pick’ research that confirms our existing beliefs. Using information from systematic reviews and meta-analyses such as the EEF’s Teaching and Learning Toolkit and guidance reports can help us to ensure that our day-to-day practice is supported by a broad evidence base.

References

Using research evidence | EEF (educationendowmentfoundation.org.uk)
Cognitive science approaches in the classroom | EEF (educationendowmentfoundation.org.uk)
Teaching and Learning Toolkit | EEF (educationendowmentfoundation.org.uk)
Guidance reports | EEF (educationendowmentfoundation.org.uk)

Further reading

Educational – Key Concepts for thinking critically about educational claims (thatsaclaim.org)
