Did sixty students really cheat?
Fraud or fluke
It might be the biggest case of exam fraud in the history of the university. Even the 2020 case at the Faculty of Economics and Business, in which exams were declared invalid, only involved 116 students across three different exams.
But on January 26 of this year, at least sixty students – out of 154 – supposedly cheated on the online Sociology of Arts II exam. Supervisors detected uncanny similarities in the way students phrased their answers. They also found something else: many of the students had the same IP addresses, indicating they had been in the same physical location.
The situation seemed clear-cut. The supervisors notified the board of examiners, which decided that with so many students cheating, the exam’s validity could not be guaranteed. And so, on February 22, all students were notified by email that their exam was considered invalid. Everyone had to take the resit, whether they had been caught cheating or not.
However, the board of examiners still doesn’t know who cheated and who didn’t. Which raises the question: how many students really did commit exam fraud? And is another explanation possible?
Students believe there is. They’re infuriated. ‘Why did they tar all of us with the same brush? It’s simply unfair’, says pre-master student Wendy Zhao. ‘I had bought a ticket to Prague for March 26, but they announced a resit for the 30th, so my ticket had to be voided. And in addition, we had at least three deadlines on those days.’
It’s impossible to have one-on-one talks with everyone
They’re also surprised about the way things were handled. ‘We thought people would be flagged and contacted afterwards. But no one has been contacted’, says eighteen-year-old Raphael Edde.
At a hearing on March 23, where the board of examiners and lecturers explained the situation, hardly any details were mentioned, says Wendy. ‘We asked several times about the exact number of fraud cases, but the board declined to give it.’ According to Raphael, ‘the whole tone of the meeting was kind of condescending. One member of the board said they didn’t know if people cheated intelligently’.
Voiding the exam for everyone really was the only option, according to Quirijn van den Hoogen of the faculty’s board of examiners. ‘There were 160 students in this course. It’s impossible to have one-on-one talks with everyone. At a certain point you feel the whole exam is compromised, and that’s when you stop any individual-level investigating.’
They had the students’ best interests in mind, he explains. ‘As a board of examiners, our overriding concern is the validity of our students’ diplomas. Those should be beyond any suspicion.’
The only thing students were told they could do was lodge an appeal with the Central Portal for the Legal Protection of Student Rights. But it would take ten to eighteen weeks for a decision to be made: too late to change anything about the situation at hand.
There are other explanations for the similarities in their answers, the students think. One of them suspects the anti-plagiarism software might have flagged cited references as ‘copied answers’. And Wendy feels the content of the exam may have played a role. ‘When you ask about definitions and concepts, it’s hard for people to change the words in their answers.’ The students also used a WhatsApp group to share notes and prepare. ‘And that means we may end up making similar mistakes’, Wendy says.
Raphael, too, isn’t surprised about the similarities. ‘People will have similar answers when they use the same material. We are all given the same information and the same tips on how to write answers.’ Students also use the same mock exams, he says.
People will have similar answers when they use the same material
The students’ IP addresses aren’t exactly a smoking gun either, Raphael notes. ‘I was uncomfortable knowing that they use those to track down our location without us even knowing’, he says. But what’s worse is that IP information may cause false positives. Students can have the same address because they take exams in a public place or in student accommodation. ‘It almost seems like confirmation bias’, he feels.
The supposed fraud may even have been a case of miscommunication. Until the very last moment, students didn’t know whether the exam was open book or not. ‘When I asked one of the lecturers, they said: it will be equivalent to open book, because we can’t stop you using external materials’, one student says.
The board of examiners stands by its decision, though. The exam was not open book, Van den Hoogen says. ‘Given the fact that you are at home, we cannot control what material you have available and what you used during the exam, but the student pledge is quite clear: it says you present us your work. That means you cannot copy answers from others or materials from outside.’
He doesn’t believe the alternative explanations for the similarities either. ‘There is a big difference in the way students formulate definitions or describe a certain term’, he says. ‘That variety was not there, because that’s what the plagiarism scanner checked.’
The IP check was only the very last part of the puzzle. ‘If we hadn’t found the other similarities already, there would be no need to check them. That’s also the reason why we didn’t want to go into the individual level.’
The fact that the students are upset about the situation seems a healthy response to him. ‘From the perspective of the board, I even think I would want them to be upset, so that they understand that there is an issue’, he says. ‘We take it seriously: copy and paste is simply not allowed.’
Van den Hoogen believes that the use of better software is the main reason so many fraud cases are detected these days. ‘In earlier years, we had maybe one or two plagiarism cases a year, and now we talk about plagiarism cases at every meeting. But maybe it’s also because younger generations are far more used to copy and paste.’
In this country, people are innocent until proven guilty
However, not all faculties treat cases like these the same way. When those three exams at the Faculty of Economics and Business were declared invalid, the board of examiners had flagged every cheater individually, spending many hours researching each student’s situation.
And when the Faculty of Spatial Sciences found a sharp rise in the number of cases of exam fraud last year, their board of examiners dug into that situation, too. ‘We believe the increase was caused by the use of fraud-detecting software’, says board secretary Erik Meijles. ‘But that doesn’t mean there are more fraudsters; the software just detects similar texts.’
Meijles says it is then up to the examiners to check if it really is fraud. ‘We have an example where multiple students gave the same answer, but it turned out they all had a document to prepare for the exam, and part of it was from the teacher’s notes. So in the end we decided it was not fraud.’
The fact that the arts faculty’s board of examiners did not flag every single case would be unacceptable in his faculty, he says. ‘The board should carefully consider before declaring an exam completely invalid, because you are negatively affecting students who didn’t commit fraud.’
Meijles understands that the number of cases might make the situation hard to research. ‘But I would be very careful to claim fraud has been committed. Because in this country, people are innocent until proven guilty.’