UG in two minds about ChatGPT
Threat or opportunity?
First, there was astonishment. Was it really possible that an AI tool could handle information and language so intricately? How could this have happened so fast?
Then, there was awe. ChatGPT could really do that! It could answer questions, write poems, letters… whatever you asked it to.
And finally, there was worry. Because how can we be sure that a student actually wrote their essays themselves? What if they just asked the AI to answer the questions for their assignments?
‘You have to realise that, when this came out, there was no policy’, remembers policy adviser Hendrik Visser with the Faculty of Arts. ‘That had to be formulated first. We’re a large and complex organisation, so we can’t just change our educational activities in one fell swoop. That’s not something that can be done in a week.’
No general rules
But now, almost one year after that fateful day when OpenAI introduced ChatGPT to the public, with Microsoft’s Bing and Google’s Bard in its wake, the university has had time to recover and decide what to do with AI.
For one of our courses, our students have to use generative AI
Bart Brouwers, journalism
One thing is clear though: the UG has not come up with general university-wide rules for the use of AI, because ‘there is no one-size-fits-all solution’, as the helpdesk EDU Support states on the UG website. ‘The impact of AI tools depends on the specific context of academic fields, degree programmes, courses, and learning outcomes.’
So how do different programmes and faculties deal with it?
At some faculties, nothing has really changed. ‘It’s too early to get into detailed questions, since there are no major consequences so far’, says board member Manda Broekhuis with the Faculty of Economics and Business. ‘The policy is still evolving.’
The Faculty of Law is not worried either, says student assessor Joris Bouman. Sure, the technology could be used to write papers, he admits, ‘but for the most part, AI is not advanced enough to write legal answers, especially for Dutch law.’
But others within the university have decided AI is an opportunity and they will have to adjust. Journalism, for example. ‘For the course Innovation and Entrepreneurship in Journalism, our students have to use generative AI’, says professor of journalism studies and media Bart Brouwers. ‘Also, as part of my storytelling lectures, I ask the students to think about how they could integrate AI in journalism.’
Yes, he is wary of the use of AI for essays, which are a common part of the assessment methods, but he wants students to be ‘up-to-date’ with modern journalistic practices too. ‘I teach them about the risks and the no-gos, but also about the opportunities. We teach them how to use generative AI for journalistic purposes.’
However, they do have to learn the craft without these tools, he stresses. ‘For the skills part of our master, assignments are closely supervised and always concern current issues that require real-life interviews.’
Medical sciences is also mostly positive about the possibilities of AI. ‘It is important that we educate our students in a future-oriented way’, stresses spokesperson Lex Kloosterman. ‘We want to teach our students when this AI tool can add value to learning and development, but also what the pitfalls are. AI is a promising technology with a lot of impact on healthcare and research.’
We believe ChatGPT can be put to good use by students
Together with data science centre DASH, medical sciences has been setting up information sessions for teachers on ChatGPT. The faculty is currently also developing an e-learning module for students on the added value and pitfalls of AI language models. ‘We believe ChatGPT can be put to good use by students, for example to get a start on an assignment or report. So students who are a little less creative or have trouble getting started may be helped by this’, Kloosterman says.
Essays are the biggest possible problem, he admits, but he hasn’t seen any major issues there yet. ‘Our programmes are vocational programmes in which students must, above all, also demonstrate that they have mastered certain practical skills. In addition, written tests or essays are often combined with a presentation or discussion.’
Others approach AI with caution. The Faculty of Arts, for example, has established a working group to look at the issues that the different study programmes are presented with.
Visser is mostly worried about the implications of AI for educational outcomes and how those may affect current assessment methods. ChatGPT and similar programmes have added to the ways a student can plagiarise, and the faculty worries about how to detect this kind of fraud.
Students should not be able to outsource learning outcomes, he says. ‘You have to do your own work. That’s the vision of the Faculty of Arts.’
But that’s not always the reality. At Campus Fryslân, AI use was suspected when lecturers, using certain prompts in ChatGPT, were able to generate answers highly similar to the ones students had submitted.
The UMCG reports a case that the examination board dealt with as well. The Faculty of Arts even reported several instances of plagiarism related to AI.
Hard to prove
No doubt there are more cases, but proving plagiarism by ChatGPT is hard. ‘Only if the student admits it. There are no credible AI detection tools at this moment’, says Brouwers.
The arts faculty agrees that detection tools are not the way to go. What can be done, though, is focusing on persistent weaknesses of the software programmes. ‘AI hallucinations’, for example: the generation of references to sources that do not exist.
It’s reasonable to assume that most programmes will be affected in some way
Hendrik Visser, arts
Fraudulent AI use could also be identified by taking the ‘broader educational context’ into account, says Visser. ‘If there is a sudden spike in performance that is not reflective of class performance or earlier assignments, that might be a decent cause for suspicion.’
However, no method will be watertight. The possibilities of AI will keep growing, and study programmes – even those that now feel unaffected – will have to do some soul searching, says Visser. ‘It’s reasonable to assume that most programmes will be affected in some way. We will have to stay vigilant in our understanding of how these tools work.’
That’s why the Faculty of Arts is developing an AI policy for the faculty as a whole and working to integrate AI tools in education. They will also provide training sessions on AI for the teaching staff to acquaint them with ways they can work with it.
Campus Fryslân, too, has invited students and teachers to discuss the situation critically. It also added a dedicated session on ChatGPT and AI to the first-year academic communication course for University College students.
Teachers will have to think about their goals, Kloosterman says. ‘I can imagine the focus of science courses shifting towards creativity and good ideas, rather than on being able to summarise articles yourself. But that requires teachers and educational developers to think about the content of education. Which learning objectives and teaching methods are enriched by this technology, and which ones aren’t?’
The measures that the university is taking so far are not about scaring people off, but about advising them to stick to the rules, Brouwers adds. ‘They are there to make sure their learning journey will be as satisfactory and successful as possible.’
And don’t forget, says Kloosterman: ‘AI programmes are not cheating tools by definition, but tools that can also be very valuable in a student’s education.’