Improving Feedback by Improving Subject Knowledge
by Bradford Research School
Being evidence-informed means going beyond the quick fix or the top tip. Using examples from the EEF, we show how a little digging beneath the surface can reveal those important nuances.
The EEF Toolkit
The Teaching and Learning Toolkit, and its counterpart the Early Years Toolkit, are heavily visited by teachers and school leaders. Both are incredibly useful, but there are pitfalls to avoid, and you will get much more from them if you scratch beneath the surface and read the full summaries. In a previous blog, we set out five questions that you can ask.
When dealing with a meta-analysis such as this, the overall rating can hide both good and bad studies. Take ‘Setting or Streaming’, for example. The Toolkit states that its impact is negative (−1 month’s progress). But the summary includes the following passage:
The evidence suggests that setting and streaming has a very small negative impact for low and mid-range attaining learners, and a very small positive impact for higher attaining pupils. There are exceptions to this pattern, with some research studies demonstrating benefits for all learners across the attainment range.
So the average is negative, but there are examples where the impact is positive. And you see this idea of variability in other summaries, such as that for Digital Technology:
Studies consistently find that digital technology is associated with moderate learning gains: on average, an additional four months’ progress. However, there is considerable variation in impact.
And if you want to explore this even further, the summary links to the studies that fed into the meta-analysis.
Similar nuances can be seen in evaluations of projects. It’s easy to skim the surface of an evaluation and conclude that it worked or it didn’t. But the reality is always more complicated than that, and it’s worth reading further. For example, following a successful efficacy trial of Catch Up Literacy, in which the EEF reported an impact of +3 months, the scaled-up version showed less promise. One of the conclusions of the evaluation report was that implementation was an important factor, a nuance that is lost without reading the report:
The intervention was not always delivered as intended. Some schools struggled to resource two one-to-one sessions per week, while in other schools TAs adapted how they delivered individual sessions from what they were taught in the training.
Beyond the Headlines
Interesting new research is often published with an attention-grabbing headline in the press. When the Improving Behaviour in Schools guidance report launched, one such headline was ‘Greeting pupils at the door improves behaviour’. When our research lead Luke Swift looked a little further into the original study (Cook et al., 2018), he saw that ‘positive greeting at the door was only one aspect of the intervention evaluated.’
Four of the researchers were kind enough to discuss the paper with him, and while the headline is not exactly false, it certainly hides the nuance. Andrew Thayer said:
A simple "hello" is not necessarily the greeting we are looking for. If you must, pair it with an open-ended question that CANNOT be answered with one word like "good" or "fine." Instead, I often recommend teachers do one of two things with their greeting: 1) implement behaviour specific praise ("You have been so quiet in line, thanks so much. I can tell today is going to be a good day."); or 2) a specific relationship question, like, "Hey did you manage to win some games in Fortnite last night?"
Following the Evidence
Sometimes a lot of the how and why can be lost by the time a message around evidence is communicated. One example of this that we encountered recently was in the Metacognition and Self-Regulated Learning guidance report. Recommendation 7 is that ‘Schools should support teachers to develop their knowledge of these approaches and expect them to be applied appropriately’, and within that, a bullet point suggests that ‘Teachers can use tools such as ‘traces’ and observation to assess pupils’ use of self-regulated learning skills.’
This mention of traces is merely the tip of an evidence iceberg, and it’s worth exploring to understand some of the nuance. The reference took me to the Metacognition and Self-Regulation: Evidence Review, where the following appeared:
...researchers have advocated the use of real-time rather than retrospective measures, collecting indicators of self-regulation as students are completing a particular task. Two main types are identified in Dent and Koenka’s (2015) review: traces and think aloud protocols. Traces are observable signs of cognitive strategies students use while completing a task, such as underlining a passage or making notes alongside a piece of text. These are not reliant on self-report, but have their own inherent biases and issues, such as the fact that it is not easy or even possible to establish metacognitive processes underlying these cognitive strategies, and that such strategies may themselves be used rather unthinkingly where students are taught or expected to do so by teachers.
Not only did this reveal some of the nuances and limitations of ‘traces’, but it pointed me to additional research (Dent and Koenka, 2015), which in turn pointed me to further research (Winne and Perry, 2000). The latter revealed much finer detail on the concept of traces, and broadened my understanding beyond that simple bullet point in the guidance.
We have used so many metaphors in this post: digging deep; scratching the surface; icebergs. Whatever the metaphor, look for the nuances.
References in this post:
Cook, C. R. et al. (2018) ‘Positive Greetings at the Door: Evaluation of a Low-Cost, High-Yield Proactive Classroom Management Strategy’, Journal of Positive Behavior Interventions, 20(3), pp. 149–159.
Dent, A. L. and Koenka, A. C. (2016) ‘The Relation Between Self-Regulated Learning and Academic Achievement Across Childhood and Adolescence: A Meta-Analysis’, Educational Psychology Review, 28, pp. 425–474.
Winne, P. H. and Perry, N. E. (2000) ‘Measuring Self-Regulated Learning’, in Boekaerts, M., Pintrich, P. R. and Zeidner, M. (eds.) Handbook of Self-Regulation. San Diego, CA: Academic Press. doi: 10.1016/B978-012109890-2/50045-7.