

Breaking Down Complex Writing Tasks Part 4: Monitoring

Mark Miller on approaches to help pupils monitor their writing

by Bradford Research School

Recommendation 4 of the EEF’s Improving Literacy in Secondary Schools is ‘Break down complex writing tasks’, and it acknowledges that even seemingly simple writing tasks can be more complex than we at first imagine. Because writing is central to so many subjects, we must try to understand just what makes writing so hard. Over a series of blogs, we are exploring how to address each of these elements. This time we look at how we can help students monitor their writing.

Catch up on our previous blogs in the series:

Breaking Down Complex Writing Tasks Part 1: Spelling
Breaking Down Complex Writing Tasks Part 2: Sentence Construction
Breaking Down Complex Writing Tasks Part 3: Planning

Checklists

One simple way of doing this, according to the guidance, is ‘providing a checklist of features included in high quality answers’.

I don’t think anyone who teaches writing will find the idea of checklists particularly groundbreaking. But the word ‘providing’ is doing a lot of heavy lifting in the sentence above. For some pupils it may well be enough to provide a checklist or similar to help them to monitor their writing.

It comes down to working memory capacity. Checklists can support the act of ‘cognitive offloading’, the freeing up of working memory, in this case to focus on other aspects of the complexity of writing. Rather than having to hold the checklist in their heads, and thus take up cognitive capacity, pupils can refer back to it. However, applying a checklist may actually decrease writing fluency, as going back and forth between checklist and writing can interrupt the flow of composition and take up vital cognitive resources.

If we want checklists to be used as effective monitoring tools by those who may not find their use instinctive, we can model their use. Just as we would model the act of writing itself, so too should we model the use of a tool that can support the writing process.

Explicit modelling of monitoring

It’s not just the use of tools that we should model; we should also model our thought processes. We’ve written about this recently, in our What to Reveal when Modelling post, but we’re happy to repeat it: we need to make the implicit explicit. Monitoring our progress is one of those hidden things that not everyone sees, and if pupils don’t see it, they don’t know it is going on and certainly can’t do it themselves.

When modelling, we can simply ask: ‘How well am I doing?’ This question can change depending on the context.

  • How well am I doing compared to my last attempt?
  • How well am I doing compared to my original plan?
  • How well am I doing compared to my general writing targets?
  • How well am I doing in relation to audience, form and purpose?
  • How well am I doing in relation to the marking scheme?

Modelling can also help us to see that sometimes writing will go wrong, so if the answer is ‘I’m not doing well’, then we can model the process of responding to this. One thing that can be made very explicit is the link between the planning or goal-setting stage and the monitoring: we should be monitoring how we are getting on compared to what we set out to do.

Metacognitive scaffolding

Any of the questions in the above section could also be provided as prompts for pupils. These questions form metacognitive scaffolds, in that they support a monitoring process which may not take place otherwise. As with other scaffolds, our goal is to gradually remove them as pupils become more independent.

One way to think of the scaffolding would be as a series of prompt questions provided to pupils. The following question types have been adapted from those that Mevarech and Fridkin (2006) used in their study on mathematical reasoning:

  • Comprehension questions orient students to articulate the main ideas in the writing task (e.g. What is the text type?).
  • Connection questions lead students to construct bridges between the given task and those completed in the past (e.g. What are the similarities and differences between this task and the previous essay question, and why?).
  • Strategic questions refer to strategies appropriate for tackling the task (e.g. What stylistic conventions will I need to present in this piece of writing, and why?).
  • Reflection questions guide students to look backward, either during the writing process (e.g. Why am I stuck? How should my conclusion link to the introduction?) or at the end (e.g. Is the text appropriate for the audience?).

Teachers could provide generic questions that could be used in a range of contexts, and we would expect to see some transfer across different writing tasks. But generic questions have their limitations, and they can be complemented by task-specific questions.

As I was writing this blog, I was constantly monitoring, reviewing, checking, comparing, looking back and forward, stopping myself from checking Twitter, comparing to previous blogs, reading out loud, deleting, rewriting, deleting again. What would I do better? That can come in Part 5: Evaluating. 
