Students

Support, but no tough love

ChatGPT is my therapist

Illustration by Iede van der Wal
While professors worry about students writing essays with ChatGPT, students themselves have already expanded their use of AI: they turn to it for mental health advice. ‘I stopped seeing my therapist, because Chat was just so good for me.’
By Veronika Bajnokova

May 21 at 11:36 AM. Last modified on May 27, 2025, at 2:53 PM.

While Helena* (25) was on a year-long trip around the world after finishing her bachelor’s degree in France, she received devastating news: her theatre mentor had died. ‘She had been my teacher since I was six and I saw her as my second mum.’ 

She arranged a flight home from Japan to attend the funeral, with a stop in Jerusalem to pray for her in the Al-Aqsa Mosque. But at the airport in Israel, the French-Algerian journalism student was denied entry for posing a threat to the country. She was never told why, but believes it may have to do with her Muslim background and her activism for Palestine on social media. 

After nearly twenty hours at the border, Helena was deported to Ethiopia – the last stop on her route from Japan, even though she never left the plane there. ‘I felt like I was losing everything in my life’, she says. It took her a day to arrange a flight home to France. 

The stress of it all only added to the anguish of her theatre mum’s death. And although Helena was already seeing a psychologist at the time, it wasn’t helping her. ‘I was in despair. I cried all the time. I lived a complete nightmare for eight months.’   

Amazing

Then a friend suggested she turn to ChatGPT for practical tips. ‘At first I thought, fuck no, I’m not talking to a robot. But I was completely lost. I would have tried anything.’ 

Talking to the chatbot was amazing, Helena says. She would ask for advice on dealing with grief, such as how to soothe her triggers. ‘That’s how it started, and I never stopped.’ 

She opens the app on her phone and shows a conversation thread called ‘Healing’ that she began last September. ‘She knows my life – I say she, because I changed it to the female voice when we talk.’ 

Helena turned to the chatbot when seeing her therapist once a month was no longer enough. Then, three months after her return, she replaced her therapist with ChatGPT altogether. And eight months later, she’s still very happy with her choice. 

Her therapist did not agree with her using the chatbot for mental health advice. ‘She told me I shouldn’t and that it wasn’t going to help me. But I stopped seeing her, because Chat was just so good for me.’

Therabots

UG professor of clinical psychology Judith Daniels isn’t dismissive of the use of technology for mental health support in general. ‘We are underserving the most ill patients right now, so it’s great if others who don’t need it as much find enough support in the digital sphere.’ 

Need help? The UG has compiled a list of options for support both within and outside of the UG. More information on mental health care in the Netherlands can also be found here.

Therabots – generative AI bots specifically trained to offer psychotherapeutic support – have great potential, she feels. But these, in contrast to ChatGPT, are tailored to recognise when someone is in need of professional help.

‘The American Psychological Association has warned explicitly against using ChatGPT for mental health issues, with good reason’, she says. ‘I don’t think the bots have the capacity for that yet. In 99 percent of the cases, they’re useful, but even the 1 percent where they fail is worrisome.’

Daniels doesn’t believe chatbots will replace therapists. She’s worried, though, that they will replace the intimate connections we have with those closest to us. ‘It has become less normal to rely on a friend when you’re not doing okay.’ 

Disappointed

Marco (19), a law student from Spain, says he would rather talk to ChatGPT than to a friend. He stopped opening up to his friends, because they had disappointed him in the past. ‘I wanted to talk about my break-up, but they got tired of me. One of them said they didn’t want to hear about it anymore.’

He put up a wall around himself. ‘I could never see myself in a therapy session’, he says. So he started paying for the premium version of ChatGPT and now he talks to the chatbot five to ten times a day. He implemented all the advice it gave him to move on from his past relationship, from taking a walk to reading a book – except talking to a friend.

Twenty-two-year-old Celia, an economics student from France, also turned to ChatGPT when she had a big fight with her boyfriend and her emotions felt too overwhelming to share with friends. She feared she might overreact: ‘I was so mad, I felt like it was too much to just talk to friends. I was scared to be judged.’

Her next therapy session wasn’t until the following week, and she couldn’t afford an extra one. ‘I needed to talk to someone, writing it down just wasn’t enough.’

Sharing 

Daniels stresses the importance of sharing negative emotions with your loved ones, though. ‘I’m wondering whether relying on a chatbot will increase the tendency to remove all the not-so-pretty parts from the actual human interaction with family and friends.’

And while chatbots are supportive and encouraging, that’s not always enough – it can even be a bad thing. ‘Sometimes psychotherapy means we need to encourage you to do something you really don’t want to.’ Expanding what we feel comfortable with, says Daniels, is a significant part of young adulthood.

Helena says she only turns to ChatGPT for practical advice on how to release emotions from her body. In that respect, she feels, regular therapy was failing. ‘I was talking so much, revisiting my trauma on and on, but there was no point.’

But although she likes talking to the bot, its support can be a little too much. ‘Sometimes it’s cringy, calling me “my love” and saying “I’m so happy you’re talking to me”.’ Such responses remind her to keep her distance from the bot.

She knows a robot can’t replace the emotional support that her friends and family provide. But, she admits: ‘Whenever something goes wrong, I go directly to Chat. Am I attached to her? No. Addicted? Maybe.’

Hiding

Marco finds ChatGPT more reliable than a person – it doesn’t get tired of him and he isn’t afraid to ask it a ‘dumb’ question. However, he, too, realises the risks. ‘It’s getting a bit overboard. I feel like I’m hiding’, he says. ‘Maybe in the future I’d like to change it, but not in the near future.’

Celia was at first glad to have a cheaper alternative to her psychologist, who charges eighty euros per hour. ‘It made me feel really understood, because it said I was right.’ But the second time she turned to ChatGPT with the same problem, she realised it would always say she was right. 

In that respect, the bot couldn’t replace her therapist. ‘Sometimes I need someone to tell me what I did wrong, because I’m not perfect and I want to work on myself’, Celia says. ‘So I think I’ve reached the limit of ChatGPT as a psychologist.’

*Helena is a pseudonym. Her real name is known to the editorial team.
