

Evaluating Metacognition

In this post Chris Runeckles shares his evaluation of the implementation of metacognition at Durrington.

by Durrington Research School

When thinking about implementation, I often remember the eloquent words of Professor Jonathan Sharples, who said: “Implementation isn’t sexy.”

I tend to agree. It is an area of school leadership that is sometimes unappealing and that we are notoriously bad at, preferring to get excited by a new idea rather than methodically thinking through how we can make it real.

However, if implementation in general isn’t sexy then evaluation is its least attractive feature. It is complicated, long-winded and at the end of it you may well find that the implementation you have spent hours creating and convincing people to adopt is either not working or not happening.

Despite this, building evaluation into our work is essential if we are to know what works and what doesn’t and whether we are having a positive effect on the areas of school life that we are focused on improving.

With this in mind, I set out last summer to evaluate how effectively the principles of metacognition had been implemented at Durrington. As mentioned in a blog a couple of weeks ago, metacognition has been on our radar for some time. However, from last September it became a teaching and learning priority. My aim for last year was to develop a shared understanding of metacognition and for it to start to change teacher and student behaviour.

My starting point for the evaluation was what I expect is a rather dusty and often ignored corner of the EEF Metacognition guidance report on page 27. Here the guidance report gives sage advice on how to assess the impact of self-regulation and metacognitive interventions. Using this guidance I decided on the following three methods of evaluation.

  1. Trace observations of students.
  2. Questionnaires for curriculum leaders (completed in September and again in June).
  3. Questionnaires for those classroom teachers who had selected metacognition as their appraisal focus for the year.

Trace observations are where you look for observable metacognitive strategies used by pupils, such as underlining a passage or making notes on how to plan an activity. I felt this was still open to substantial bias and might produce rather woolly results. Therefore, I added further structure to it by using the four “levels of metacognitive learners” proposed by Perkins (1992). The basis of these is set out below:

  1. Tacit learners are unaware of their metacognitive knowledge. They do not think about any particular strategies for learning and merely accept whether or not they know something.
  2. Aware learners know about some of the kinds of thinking that they do such as generating ideas, finding evidence etc. However, thinking is not necessarily deliberate or planned. 
  3. Strategic learners organise their thinking by using problem-solving, grouping and classifying, evidence-seeking and decision-making etc. They know and apply the strategies that help them learn. 
  4. Reflective learners are not only strategic about their thinking but they also reflect upon their learning while it is happening, considering the success or not of any strategies they are using and then revising them as appropriate.

I like these definitions and I felt they gave me something concrete to look for as I circulated around the school. Below is a summary of what I found:

Traits of tacit learners observed:
• In science, Y9 students had not reflected on the information they had written down or its wider significance.
• In business, Y10 students were not able to explain any of the strategies connected to the task they were performing.

Traits of aware learners observed:
• In maths, Y7 students were applying maths strategies to “real world” problems. The students managed to solve the problem but showed limited awareness of the best strategies to use.
• In D&T, a Y9 student could identify that they always over-complicate their designs, which leads to them not finishing. However, they had not been able to act on this or change behaviour.
• In English, Y7 students were able to describe a PEE paragraph but not explain its purpose.
• In science, Y9 students asked elaborative questions showing a desire to think more deeply, however these were not particularly strategic.

Traits of strategic learners observed:
• In geography, Y7 students were able to make a reasoned comparison of the different methods to measure height on a map and explain why one was better than another.
• In history, Y7 students were able to evaluate what had gone well and badly in an assessment and describe what they would do differently next time (though with little deep reflection on why).
• In maths, Y7 students were selecting a particular strategy to solve a problem without prompting (lowest common multiple). They knew the strategy to use but not why they were using it.
• In French, Y7 students were able to explain their strategies for translation. They said they would first sound a word out in their heads to see if it was similar to an English word, then use a dictionary, and then use either their books, a partner or the teacher. They were not able to explain why one might be better than another.

Traits of reflective learners observed:
• In D&T, Y9 students were able to verbalise a strategy from a different project earlier in the year (drawing a safety line) that they had applied to their current project. They could explain the value of the strategy.

To summarise, what I found was that while all types of learner exist, we are perhaps at the stage where we have a lot of strategic learners but have not yet broken through to creating truly reflective learners. This then helped me consider how to refocus my work this year.

These conclusions were supported by the questionnaires I completed with those staff who had chosen metacognition as their teaching and learning focus for the year. The summary of what the questionnaires revealed was:

  • Even those staff who had chosen metacognition as a focus this year did not feel completely comfortable in their understanding of it.
  • Having tried it, all staff intended to keep going with it.
  • SPDS and magpie observations would be popular CPD choices to develop metacognition further.
  • There are some tentative indications that metacognition when applied to specific concepts or tasks by our staff can have a positive effect.

Again, the evaluation gave me focus. It suggested that metacognition was taking hold and was popular, but the shared understanding I was seeking was still some way off.

Finally, the curriculum leader questionnaires were my best way of judging progress through the year. I had asked them to complete the questionnaire in September, giving me a baseline of where they saw their staff and students in relation to key metacognition knowledge and processes. I then conducted the same questionnaire with the same people in June and compared the difference. The summary of what I found was:

  • Evidence that teacher understanding was improving, although there was work still to do in terms of getting full fidelity to the principles of metacognition.
  • Our students were not at the “reflective” level yet.
  • Student understanding of the principles of metacognition was still relatively weak.
  • Modelling of thinking has been an area of improvement this year.

As with the other evaluative tools, this narrowed my focus as well as chiming with the other aspects of the evaluation. This then allowed me to create the following list of actions for this year:

  • To write a metacognition implementation plan.
  • To share key points from the evaluation with staff at INSET in September.
  • To work with departments through SPDS as a primary lever for developing teaching knowledge, understanding and strategies.
  • To lead a metacognition inquiry question group at INSET.
  • To develop a shared language around metacognition that students are included in.

Clearly I am not naive enough to believe that my evaluation is anything like watertight, or that my own biases have not crept into it. We are ultimately teachers, not university researchers, so we have to, in my opinion, accept the limitations of what we can do with evaluation. However, having completed the process I certainly felt far more informed and far clearer on what to do next. Maybe not sexy, but definitely reassuring.
