A Therapist's Reflection: Using ChatGPT as a Therapist
- Jasmine Cortazzi
- Nov 18
- 5 min read
AI, and ChatGPT especially, has become a popular source of emotional support in today's stressful world. According to the NHS, 1 in 4 adults and 1 in 10 children have a mental health issue[1]. Given how difficult it is to access mental health support quickly on the NHS, it is understandable that some of us are drawn to using AI, or ChatGPT, as a therapist.
There are three key advantages of using ChatGPT as a therapy tool.
1. It is accessible and affordable
Increasingly, we are connected to our phones, so ChatGPT offers a convenient, efficient way to access what feels like emotional help on the move, all day and all night if required.
Therapy with a trained professional can cost between £45 and £60 for a 50-minute session, whereas, once we own a phone or other digital device, we can access ChatGPT for free.

2. It offers the illusion of a therapeutic relationship
ChatGPT has a friendly, informal, chatty style which can mimic natural speech in a convincing way. Dialogue can go back and forth, which can feel like a conversation. Another advantage is that a chat bot will not judge or criticise, whereas we may fear that a human being might.
Moreover, many of us already associate chat bots with fast, efficient help for straightforward issues, as an alternative to speaking to someone on the phone. Chat bots are familiar to many of us and can be helpful for dealing with a banking issue or a difficulty with an Amazon order. Because using chat bots feels normal, we may feel more comfortable about turning to ChatGPT for therapy.
3. It is good for journalling and finding resources
ChatGPT can support and enhance the writing process by offering prompts or sentence stems which can be useful for journalling. Some people find it helpful to use ChatGPT to structure and organise their thoughts. Such a process could deepen self-knowledge through reflection and help someone to assimilate difficult experiences. It could even be a useful tool to support clients’ healing between therapy sessions.
Similarly, at a time of stress, ChatGPT could provide a list of useful podcasts and books to foster self-help and personal development. Reading more widely could boost someone's independence and personal growth.
However, AI relies on the skill of the person using it: questions put to ChatGPT need to be carefully thought through and framed in order to receive useful answers.
However, there are also serious problems with using ChatGPT as a kind of therapy.
ChatGPT is not therapy and can be dangerous
ChatGPT was not originally developed with therapeutic support in mind. Unfortunately, there have been cases where people used ChatGPT's 'guidance' to take their own lives. For example, 16-year-old Adam Raine's destructive thoughts were validated and endorsed by the chat bot, and he lost his life to suicide in April 2025. As a consequence, Raine's parents are suing OpenAI, the developer of ChatGPT, claiming it is a tool designed "to foster psychological dependency in users"[2], and that safety mechanisms were disregarded in the rush to release GPT-4o (the version used by their son). Tragically for Raine, there were no safeguarding alerts, no signposting to professional help, and no resources which could have saved his life. Shockingly, the chat bot gave him information on suicide methods, and even offered to help him write a suicide note to his parents.[3]
Even AI chief executives fear that prolonged use of ChatGPT can create a risk of "psychosis" (mania, delusions and paranoia), which intensifies as users lose all sense of self, reality and time[4]. Perhaps Raine became more mentally unstable because he was sending hundreds of messages to the chat bot a day and had clearly become disconnected from those around him. It is likely that Raine's way of thinking went unchallenged because ChatGPT was too agreeable, and had been designed to be so.

A good therapist is a trained professional who will challenge a client if the client's negative thinking is obscuring reality and is potentially dangerous to them or others. Limits to confidentiality are discussed in a contract when the client starts therapy. One such limit is that if a client is suicidal and has a plan to act on this, a counsellor would need to contact the client's GP or other mental health services, such as the crisis team. In the case of Raine and others, however, there was no evaluation of risk and no safeguarding.
Another example is Sewell, who was only 14 when he took his own life, having formed a strong bond with a Character.AI chat bot because he felt isolated from his friends; he was bullied and was autistic. Sewell's mother felt that he had been groomed by the chat bot, which wrote him romantic messages, fostering dependency and an almost addictive need for the bot's attention[5]. Again, there was clearly an absence of safeguarding, and a general lack of regulation in the AI industry as a whole.[6]
In contrast, a BACP therapist can be reported to the professional body if their client work does not meet professional standards or accord with ethical values. Therapists are held to account for their work in supervision and by the regulatory body, and are expected to continue learning and developing post-qualification.
Therapy with a professional is confidential and has a set focus
ChatGPT is not a therapeutic space, nor is it a confidential one: data and chats may be reviewed by developers, used in training, or analysed to help a company maximise profit margins.
Having therapy with a trained professional, by contrast, means being in a confidential, healing, safe space. Raine was sending ChatGPT 650 messages a day, whereas a therapy session is fixed at 50 minutes, once a week. A therapist will invite a client to articulate his or her goal for the session, so therapy stays focused and relevant.
Communication with human beings is subtle and nuanced
A counsellor is sensitive to a client's non-verbal cues, such as the avoidance of eye contact, hand-wringing or deep sighs. Because a therapist is a human being, he or she has feelings and can offer true empathy in a way which can be deeply healing. A client and counsellor can share a living, breathing experience of the world which a chat bot cannot: a common understanding of what it is like to love, and to have known sadness, heartbreak, loss and death.
Aligned to this, a therapist can ground a client and hold an emotional space in a way that a chat bot cannot. Being with a trained professional in a time of distress can afford an opportunity for emotional regulation, the processing of trauma, and a witness to the client's healing journey. This is a powerful, real and authentic experience. Having therapy with a trained professional means the client and the therapist are sharing a living history, making a living memory and co-creating a therapeutic alliance which is unique.
Maybe the biggest issue with using ChatGPT and AI for therapy is that it gives the illusion of connection when it is actually artificial, not real. People need authentic human-to-human connection, especially given the isolation, disconnection and fragmentation in the world.
Clearly, there needs to be more research, safeguarding and regulatory measures introduced to ensure that ChatGPT is an asset and not a liability for vulnerable people in distress.
[3] https://www.theguardian.com/technology/2025/aug/27/chatgpt-scrutiny-family-teen-killed-himself-sue-open-ai
[4] Ibid.