Diagnostic Assessment in the Classroom
Using multiple choice questions to inform next steps
by Greenshaw Research School
Teachers are rightly cautious about the language they use with students returning to classrooms. They know it is not helpful to talk about ‘catching up’ or ‘lost learning’. Their priority is re-establishing the routines and relationships that make their students feel secure.
They also know there will be a need to check for gaps in learning and to decide what can and should be done about them. This will require assessment. As Marc Rowland makes clear, ‘assessment not assumption’ should guide actions, particularly when thinking about the most disadvantaged.
Different forms of assessment will be needed. Standardised tests can help identify those in need of support, whilst diagnostic tests can provide the focus. In classrooms too, diagnostic assessment can provide teachers with the information they need to respond to their students’ needs. Multiple choice is a great format for this purpose, and could prove to be a useful tool in diagnosing misconceptions or missing knowledge from the remote learning phase.
This adapted example from Craig Barton’s website reveals the wealth of information a well-crafted MCQ can provide, even when the wrong answer is selected. Option A is correct, but if a student selects option B it suggests they are confusing area and perimeter, whereas option C suggests they are applying their understanding of how to work out the area of a triangle to a rectangle.
Perhaps less well known is the way multiple choice questions can assess more cognitively demanding thinking. One such approach uses context-based multiple choice questions, where students are given some initial context, such as a text, a diagram or a table, before they answer questions on it.
Examples of these can be found in Agarwal’s (2018) study examining the optimal type of retrieval practice to enhance higher-order learning. Her students were given different passages to read, such as an overview of Nicholas II, before answering multiple choice questions on them. Some of these assessed ‘lower-order’ factual knowledge whilst others assessed more ‘higher-order’ thinking.
Both these examples are useful diagnostic tools, but probably at different phases of a learning continuum. At one end, they check factual understanding – the basis for deeper learning. At the other, they check whether underlying principles, concepts, rules or processes have been understood. Since understanding during lockdown will vary widely, both forms will be useful.
Some handy stems to help teachers write these questions can be found in Berk (1996). Stems such as ‘What would be the most likely effect of…?’, ‘Which principle explains…?’ or ‘The situation will create…?’ can be used in different subjects to assess the depth of student learning:
- Which of the following is an example of the principle…?
- Which of the following procedures should be used to…?
- Which generalisation can be made from the data…?
- What feature is a major strength/weakness of…?
- Which approach will result in…?
- The situation will create…?
- What consequence will (likely) result from…?
- What is the first step following…?
Ben Rogers gives some even more practical examples of higher-order multiple choice questions. His Trust has developed three main types: Similar/​Different, What If? and Apply It. The examples are all from Physics, but it’s easy to see how these formats could be applied to other subjects.
Obviously, multiple choice is not the only format for writing high-quality diagnostic questions. Its advantage is that once the questions are written, they can be shared and reused: all the effort is upfront. In classrooms where some of the usual means for checking student understanding have been compromised, they might prove an invaluable means of getting our students back on track.
Phil Stock, Director, Greenshaw Research School
Agarwal, P. (2018) ‘Retrieval Practice & Bloom’s Taxonomy: Do Students Need Fact Knowledge Before Higher Order Learning?’
Berk, R. (1996) ‘A Consumer’s Guide to Multiple-Choice Item Formats That Measure Complex Cognitive Outcomes’