Implementation and Planning to Sustain
Some reflections on sustaining and scaling up a strategy
by Kingsbridge Research School
To be, or not to be
We often introduce programmes or strategies assuming they will continue indefinitely. Very rarely do we state, or even think about, how long a particular practice is supposed to last. But, of course, there are all sorts of options in between indefinitely ‘to be’ and imminently ‘not to be’. We might decide to:
- Maintain the practice in its current form
- Scale up the practice
- ‘Institutionalise’ the practice
- Extend or restrict its reach
- Improve it, or improve aspects of it
- De-implement it to make way for something that’s likely to be more effective
- Specify its ‘shelf life’
Rather than pondering these decisions in year two or three when a practice is already under way, we would be well advised to think about them during the ‘Prepare’ phase. In other words, we should plan our ‘sustain’ strategy in advance.
Programme drift and voltage drop
If we don’t actively sustain a practice, we can expect to see a loss of effectiveness. The term ‘programme drift’ refers to a situation in which the expected impact of an approach diminishes over time, often because people stray from the core principles – or ‘active ingredients’ – that make it work. Alternatively, they might substitute their own, less effective versions – known as ‘lethal mutations’.
An implication of this is that part of our sustainability strategy should be to reinforce the active ingredients. We might, for example, use an existing coaching infrastructure to support fidelity to the approach, or regularly refer to a Teaching and Learning manual to prompt the desired behaviour.
Practices often lose effectiveness when they are scaled up, a phenomenon known as ‘voltage drop’. As we try to scale our reading comprehension strategy, for example, from Year 7 English lessons to all subjects in KS3, we might encounter several sources of friction. Many teachers might view it as an imposition on their role (‘I’m not an English teacher – I don’t have time for this!’). We might find that there isn’t a shared understanding of the evidence and rationale for the approach, leading to a damaging lack of buy-in. We might find we don’t have the capacity we need to support individual departments and teachers with the approach. ‘Voltage drop’, then, refers to the decreasing effectiveness we see as we attempt to scale up a strategy.
Rather than assuming that things like readiness, capacity and expertise will remain constant, we should, in the words of the EEF’s implementation guidance report, ‘treat scale-up as a new implementation process’, checking that the problem is still the problem and that the approach is a good fit for the expanded context.
Monitoring as part of the development process
However, not everyone views programme drift and voltage drop as inevitable:
“… we reject the notion that an intervention can be optimized prior to implementation, and explicitly reject the validity of ‘program drift’ and ‘voltage drop’. Rather, we suggest that the most compelling evidence on the maximal benefit of any intervention can only be realized through ongoing development, evaluation and refinement in diverse populations and systems. Instead of viewing contextual factors as interfering with the delivery of an effective intervention and needing to be controlled, we see the opportunity to learn about the optimal fit of an intervention to different care settings.”
Which is to say:
- no programme is so perfect that it can’t be improved;
- the way to improve it is through ongoing development, evaluation and refinement in real-world, complex situations; and
- barriers and threats are actually learning and improvement opportunities – as long as we actively look for and reflect on them.
This pragmatic approach mirrors what happens in classrooms when we teach adaptively. Based on the diagnostic information we’ve gathered, we do the best we can to anticipate and plan for barriers. At the same time, we don’t fool ourselves that we’ve hit upon the perfect strategy. We actively monitor what happens when it lands in the real world. What contextual challenges arise? Where’s the friction? Where does it land better than expected, and why?
What makes this approach to sustainability so appealing is that it views this theory-meets-reality moment as a fundamental part of the development process. Instead of shrugging and saying, ‘Oh well – programme drift. What can you expect?’, we reject the inevitability of decline and embrace the opportunity to find out how it can be improved in its given context – an opportunity we don’t have access to prior to launch.