Making evaluation part of implementation planning, not an afterthought

Evaluation – How we monitor and reflect

Alan Eathorne

Executive Headteacher

Alan is an Executive Headteacher of four small primary schools in the Learn Academies Trust. His expertise is in evidence-based leadership, professional development and evaluating impact.

Having spent the past decade developing as a school leader, the area I have found most fascinating has always been professional development. What we think of as CPD has changed in this time: staff attend fewer external courses, and instead these opportunities have become school-led and school-focused. What has struck me most throughout my career is how incoherent professional development can be, and how hard it is to answer the questions: has the CPD made me or my colleagues better teachers? And how do I evaluate this?

Many years ago I was fortunate to attend a series of CPD sessions at Huntingdon Research School about leading learning. The evaluation of CPD was raised, and in particular the work of Guskey (2000). This blog aims to share my reflections on this work, how I have tried to use his framework, and some wider thoughts.

For me, if school leaders have a clear understanding of Guskey’s model for evaluating CPD, they will also have a clearer understanding of evaluation generally.

The easiest part of school CPD evaluation is identifying the impact we want: children doing better. Simple? Not really. Generally speaking, the evaluation of most leadership work is judged by end-of-key-stage tests and little else. We have worked on reading with our staff this year; have the results gone up? We know it is really difficult to judge the impact that several hours of CPD, coaching and monitoring have had on any results. It might have an impact, but it might not. This has generally been the perceived measure of success or failure of many new initiatives in school. It may also feel like there is nothing else we can do, but that is where Guskey’s work on evaluating and planning CPD can offer school leaders more in their pursuit of being evidence informed.

Guskey identifies five levels of evaluating the impact of CPD. A point often missed is that he also suggests these levels can be used in the planning of CPD.

Guskey, T.R. (1986) Staff Development and the Process of Teacher Change

As mentioned earlier, it is clear that all school leaders want students to do better. Guskey (1986, p.5) explains that

‘In most cases, that end is the improvement of student learning. In other words, staff development programs are a systematic attempt to bring about change – change in the classroom practices of teachers, change in their beliefs and attitudes, and change in the learning outcomes of students.’

Guskey, T.R. (1986)

Therefore, we can look at year-on-year external and internal assessment and use this to judge the impact that the CPD is having. This is not wrong, but it may be limited by a number of factors. If you have identified a specific element of a subject that you want CPD to focus on, e.g. vocabulary in science, then you can see its impact in assessments – but is there a way to evaluate it more forensically? Can you use or design assessments or evaluations to see what the impact has been on children’s vocabulary specifically? To what extent can the findings from these assessments be relied upon to make comparisons between cohorts? What viable claims can be made, if any? And are you aware of the limitations this data may have? Evaluation at this level can be tricky, and the size of the data set means that any findings should be viewed with caution.

This diagram, from Putting Evidence to Work: A School’s Guide to Implementation (EEF, 2024), helps us to consider the data collected from assessments in the first two columns.

Often this is where the evaluation ends (or starts), and leaders are left wondering whether the CPD has had any impact. The remaining three columns of the diagram above give a glimpse of what other data or evidence could be collated. In the updated version (EEF, 2024), I was pleased to see that the need to reflect and evaluate is highlighted as a behaviour that should be seen throughout implementation.

Guskey allows us to consider other impacts that we could evaluate: participants’ reactions, participants’ knowledge, organisational support and change, and participants’ use of the knowledge or skill in the classroom.


In the next blog I will explore each of these areas and how, together, they can help leaders to better assess the impact that CPD is having in their setting.


References:


EEF (2019) Putting Evidence to Work: A School’s Guide to Implementation. London: Education Endowment Foundation.

EEF (2024) A School’s Guide to Implementation. London: Education Endowment Foundation.

Guskey, T. R. (1986) ‘Staff Development and the Process of Teacher Change’, Educational Researcher, 15(5), pp. 5–12.

Guskey, T. R. (2000) Evaluating Professional Development. Thousand Oaks, CA: Corwin Press.

More from the Lincolnshire Research School

