As I write, we are in the midst of exam marking season. Teachers up and down the country will be regretting signing up to mark exams as they plough through their load of papers, motivated only by a small financial gain for each script they mark.
Ironically, the principles upon which their remuneration is based are also those upon which students will gain success in their exams. For each correct answer given, marks will be awarded. It is a sort of educational piece-rate pay.
The ‘positive marking’ approach is so ingrained in education that we rarely stop to question its efficacy. Marks are gained for correct answers, whilst a skipped question and a wrong answer both simply score zero – nothing is ever deducted.
Why don’t we dock marks for incorrect answers? The logic is that we would rather students wrote something than nothing – give it a go. Penalising students for taking a risk when they are not certain would, the argument goes, lead to more questions being skipped. The psychological mechanism at play (or rather being avoided) is what Tversky and Kahneman (1981) famously called ‘loss aversion’. We want students to choose to write something rather than write nothing. If there is no penalty for doing so, then why wouldn’t you?
But is this optimal? Perhaps not.
Test effort
We want tests to be accurate indicators of what students have learnt. But for all sorts of reasons they aren’t. One very important reason is test motivation. A student’s effort on a test may significantly impact performance such that we cannot make a valid inference about what that student knows.
Low-effort effects are a particular problem in low-stakes testing (Finn, 2015), where poor performance carries few consequences for students. Of course, not all students need consequences in order to be motivated to succeed, but the point is that some do. What we interpret as differences in knowledge may therefore be, at least in part, differences in effort.
The recent lean towards low-stakes tests to promote retrieval practice in many English schools makes the risk of low effort greater. If we want tests to provide us with useful information about what students know and don’t know, it is important to think carefully about test-taking effort. Teachers need to find ways to maximise test effort.
Many factors affect a student’s test effort. Students have a motivational disposition towards school and tests which is related to their self-efficacy, achievement motivation, resilience and ability to regulate their behaviour. Test effort is also to some extent subject-specific, in that it varies according to the student’s academic identity (for example, they may believe themselves to be ‘good at maths’ but ‘useless at art’). Test effort may even vary from test to test within the same subject (see Wise and DeMars, 2005, for a theoretical model of test-taking motivation).
In this post, I’d like to focus on one often overlooked factor: the way marks are allocated. Helm and Warwas (2018) found that around 25% of test effort is attributable to the test situation. Mark allocation is one factor within this, and a significant one. The way marks are allocated may help us counteract, at least to some extent, the low-effort tendency of some students in low-stakes tests.
Question skipping
At the heart of this problem is question skipping, the tendency to write nothing and move on to the next question. Why is question skipping a problem?
When we set a test for students, we may have one or more of the following goals:
- To learn what students know, but also what they don’t know (so that we can do something about it if we choose).
- To learn something about students’ misconceptions. Wrong answers can tell us things.
- To promote retrieval of knowledge in order to strengthen memory and recall (known as the ‘testing effect’).
- To maximise performance so that the student is rewarded for their effort and teachers get an accurate view of what students know.
Skipping questions undermines all of these goals.
Framing
The positive marking approach we have become used to rests on the rational-choice logic we came across earlier in this post: students might as well write an answer – in other words, they choose not to skip a question – because there is no penalty for getting it wrong. In the literature, this approach is known as a ‘gain frame’, because students can only gain from their effort and lose nothing for their errors.
But while this approach incentivises not skipping, it relies on the accumulation of marks as a motivator. In other words, it assumes that students will try hard to get the right answer because they want to achieve more points. We can see, therefore, that there are two factors in play: choice and effort. It is not sufficient simply to reduce skipping; we must also maximise the effort students make to put down the right answer. ‘Effort’ in this instance may mean students reading questions more carefully and spending more time thinking hard about answers.
In a paper titled ‘The effects of exam frames on student effort and performance’ published in 2022, Ballis and Martorell set out to establish whether the traditional ‘gain frame’ is indeed better than an alternative ‘loss frame’. In a loss frame, students are endowed with marks for the test at the start but can lose marks for incorrect answers (but are not penalised for skipping).
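To make the contrast concrete, here is a minimal sketch of the two scoring rules as described above. It assumes a one-mark-per-question test with three possible outcomes per question; the study’s actual rubric may differ.

```python
# A sketch of the two exam frames as described in this post (assumed
# rubric for illustration: one mark per question; each question's
# outcome is 'correct', 'wrong' or 'skip').

def gain_frame_score(outcomes):
    """Positive marking: marks are only ever gained, never lost."""
    return sum(1 for o in outcomes if o == 'correct')

def loss_frame_score(outcomes):
    """Students start with a full endowment of marks and lose one
    for each wrong answer; skipping carries no penalty."""
    endowment = len(outcomes)
    return endowment - sum(1 for o in outcomes if o == 'wrong')

answers = ['correct', 'wrong', 'skip', 'correct', 'skip']
print(gain_frame_score(answers))  # 2 -- only the correct answers count
print(loss_frame_score(answers))  # 4 -- 5 endowed, 1 lost to the wrong answer
```

Notice that under this rubric a skipped question quietly keeps its endowed mark in the loss frame, which is exactly why we might expect an uncertain student to skip more.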
Both rational choice and simple loss aversion would suggest that students will skip more questions in a loss-frame model: if they aren’t sure of the answer, it is safer to skip the question than to risk losing a mark with a wrong answer. However, the study found that the loss frame led to less skipping! Not only this, but students scored higher overall on the test.
Loss aversion
What might be going on here?
The authors suggest that a loss-aversion effect is indeed taking place: students do not wish to lose the points that have been endowed to them. This effect is consistent with the research of Tversky and Kahneman, who found that we are often more motivated by the threat of losses than by the promise of gains. We might assume that this would lead students to play it safe by not answering questions they are uncertain about (a choice effect), but in fact it appears that loss aversion drives students to try harder to get the right answers – they think more about the question (an effort effect). This increased effort meant that students skipped fewer questions (perhaps because the extra effort made them more confident about the answer) and answered more questions correctly. The number of incorrect answers remained the same.
In support of these conclusions, students self-reported making more effort in the loss-framed test and spent more time answering questions. These findings suggest that the increased performance in the test was indeed due to increased effort by students seeking to protect their endowment of marks.
This lovely little study indicates that our assumptions about how students behave towards tests should be scrutinised. Motivational mechanisms can be counter-intuitive. There is also evidence to suggest that loss-frame models can be applied successfully across an entire course, not just an individual test, to motivate more productive behaviours (Apostolova-Mihaylova et al., 2015; McEvoy, 2016).
The way we frame a test can directly impact student performance, potentially leading to more valid inferences about attainment, more opportunities for teachers to learn from student error, greater rewards for students, and stronger learning effects through retrieval practice.
It goes to show that assessment design and implementation is not merely a technical process which can ignore human behaviour. Skipping the question of how students are motivated will lead to a false impression of what students know.