Students embrace artificial intelligence
‘ChatGPT is my assistant’
How can you explain the theory of relativity in a nutshell? How do you summarise Joe Biden’s election programme? Or explain how the parasympathetic nervous system works? And do you have any academic sources to confirm this?
Walking past the hard-working students in the UB typing away on their laptops, you might think they were all talking to the same customer service programme. Where they used to turn to Google and Smartcat – the university’s online library – to find answers to their questions, they’ve now discovered something new: ChatGPT.
The programme, which was launched in November 2022 by OpenAI, an American research laboratory for artificial intelligence, can find the answer to just about any question you can think of in a matter of seconds. It can even write a proper essay.
Practice exams
Nearly everyone has either tried the programme or knows someone who has, a quick poll shows. ‘I mainly use ChatGPT to make reports or practice exams’, says bachelor student Noah. ‘For my reports, I have it write short bits. That way, they’re the most accurate.’
He creates practice exams by giving the programme a lot of information and then asking it if it can turn that information into an exam. ‘The last time I tried it, ChatGPT even came up with questions that were on the real exam.’
ChatGPT even came up with questions that were on the real exam
Master student Thijs also uses the programme to help him write papers. ‘I had to create a theoretical model and make connections based on earlier academic articles. ChatGPT’s explanation was really convincing.’
Jesse, an international relations bachelor student, is equally enthusiastic about the programme. ‘I use it as a form of inspiration for my papers. It can quickly come up with new arguments and viewpoints, and it’s great how it can explain a lot in very few words.’
Creative
Biomedical technology PhD student Ari Rolando Ortiz Morena has also noticed the impact of ChatGPT among his friends. He first encountered the programme during the winter holidays, when a friend’s brother claimed to have found the perfect solution to writing a thesis. ‘I was taking a break from my research because of the holidays, so I tried out ChatGPT for a lark. I quickly realised just how good the programme is’, he says. ‘It was also really good at coming up with creative rap lyrics.’
Once back in Groningen, Ari tried to use the programme to help him code. ‘It knew exactly what I was asking about my data. It actually worked’, he says, enthused.
Computer science bachelor student Marc Zhou Toneu, who uses the programme to code and summarise sources, was also delighted. ‘ChatGPT is ground-breaking’, he says. ‘None of it could have existed a few years ago. Similar models back then took weeks to answer the kinds of questions this model is answering in a matter of seconds.’
Weaknesses
Ground-breaking or not, the programme does have a few weaknesses. ‘Don’t use it for academic purposes; it’s crap’, international business and management bachelor student Annike Steffen says. ‘The more you try to explore a topic, the less it knows about it. I won’t be using it for my studies.’
Fellow student Sophia has her doubts, too. When she saw how a friend had used ChatGPT to write an entire assignment for their econometrics course, she was curious. ‘It had processed all his mathematical formulas flawlessly. After he’d made a few adjustments here and there, he scored an 8.’
But when she tried it herself, the answers were much too vague. ‘I knew that if I handed in what I had, I’d fail the course’, she says.
Writing tool
Professor of computational semantics Johan Bos prefers to call ChatGPT a ‘writing tool’. ‘The programme is really good at rewriting, summarising, or even changing the tone of a text.’
It’s also an interesting source of inspiration and can help people overcome their fear of writing. ‘You can just ask the language model to make a start’, says Bos. ‘My colleagues and I are amazed at how creative the programme is. It feels almost paradoxical: a creative computer.’
The more you try to explore a topic, the less it knows about it
What sets ChatGPT apart from a search engine is that it can also say something about the text it produces. ‘It’s got not one, but two layers. The first layer forms the basis of the programme, and the second layer is the interactive part, which can comment on the text it receives from the first layer.’
The questions and requests that users put to the programme are evaluated behind the scenes by the people at OpenAI. ‘That means the programme isn’t learning on its own’, Bos emphasises. The evaluations are fed back into the language model’s second layer, which enables it to ‘learn’.
Bits and pieces
That also means the programme is potentially unreliable. ‘It’s not plugged into the internet and “learns” indirectly from the people behind ChatGPT. It can’t check its own answers’, Bos explains. ‘The language model doesn’t have a correct answer for everything. It’s just putting bits and pieces together to produce a nice text, but it doesn’t actually know what any of the words mean.’
Additionally, the programme really only works in English, and it doesn’t know anything about events that took place after 2021. ‘The first layer of the programme consists of data from before 2021’, says Bos. If you ask when exactly Putin invaded Ukraine in 2022, it reacts as follows: ‘I’m sorry, I don’t have any information on Putin’s invasion of Ukraine, because those are future events and my training data doesn’t extend beyond 2021.’
Annike also emphasises the pitfalls of ChatGPT and similar AI programmes. ‘It’s important to remember that it can’t do any academic thinking for us. The programme doesn’t know which sources are reliable. Sometimes it even comes up with sources that don’t exist.’
Simplistic
That’s why Marc doesn’t dare to copy the answers. ‘I consider the programme to be more of an assistant’, he says. ‘But if the programme improves – and it will – and becomes able to come up with real and proper sources, then that means it can write a thesis. So you could hand in the most important assignment for a bachelor diploma without doing anything for it.’
Should you list ChatGPT as your co-author, is it fraud, or are you the sole author?
‘I never copy anything directly from ChatGPT’, says Jesse. ‘In part because the text it creates is fairly simplistic. It usually contains mistakes that you miss at first glance. The programme can find real academic articles, but it makes up the page numbers since it doesn’t actually have access to the source.’
Thijs does copy the programme’s answers ‘pretty much entirely’. ‘If the university ever comes up with rules about the use of the programme, I’d find a way to circumvent any artificial intelligence detection’, he says. ‘I’d first ask ChatGPT to write me something and then have the paraphrasing programme QuillBot rewrite it.’
Policy
‘I do think it would be a good idea for the university to write a policy on this to ensure a uniform approach to using the programme’, says Bos. One thing to consider is how copyright applies when artificial intelligence is used. ‘Should you list ChatGPT as your co-author, is it fraud, or are you the sole author?’ Bos wonders. ‘Perhaps we could use watermarks in some way to show that the programme helped write a text.’
Annike also thinks something needs to change. ‘University policy currently states that you have to write your final thesis yourself. To me, that says that ChatGPT isn’t allowed. But why wouldn’t it be allowed if students check the text and can substantiate it with academic sources? Plus, I think the programme has the potential to become more reliable.’
As far as the students are concerned, ChatGPT is here to stay. ‘I think more companies are going to start creating programmes like these’, says Marc.
Bos agrees with him. ‘But you can’t just copy things from ChatGPT. As long as you give it the content it needs and check the text, the language model is a great start. The programme has great potential, as long as you use it right.’
The names Noah, Thijs, Jesse, and Sophia are pseudonyms.
No UG policy as of yet
The University of Utrecht recently announced it would be redefining its rules on plagiarism with programmes like ChatGPT in mind. Students will soon have to inform the university if they use any kind of AI chatbot for essays or tests.
The UG hasn’t formed a policy yet, says spokesperson Elies Kouwenhoven. ‘Since there are big differences between the faculties, it’s tricky to come up with an unequivocal approach. Students and lecturers in artificial intelligence will be much more involved with ChatGPT than the people in religious studies, for instance.’
However, the department of computational linguistics, several exam committees, and the educational policy department at the UG are working on a course of action. ‘Together, we’re trying to figure out what the programme means for our education and our research’, says Kouwenhoven.
What makes establishing a set of rules even more complicated, she says, is that the university can’t be sure how other chatbots will develop. ‘The programme might look very different in a month. And then there’s the ethical question of how people should even approach the programme.’