The robot is in

March 10, 2025

The use of artificial intelligence as a therapeutic tool dates back to the 1960s, when a program called ELIZA gave scripted responses to users who described their emotional states. While novel, it had no real understanding of the process it was involved in.

But AI has come a long way since then, with smartphone apps like Woebot, Wysa and Replika having sophisticated, two-way conversations with their users, offering emotional support, mood tracking, and therapeutic exercises like journaling or mindfulness.

And with the arrival of generative online AI assistants such as ChatGPT, Copilot and Gemini, mental health advice delivered by AI-driven systems looks surprisingly similar to the strategies you’d expect from real-world therapists. Each conversation is a unique interaction with an AI system that is much more context-aware and personalised – even able to remember past conversations. This allows users to explore personal challenges, mental health issues and practical problems in a more nuanced way.

In the real world, therapy can be prohibitively expensive, difficult to access for people living remotely, and hard to fit into a busy schedule. Or worse, you might find yourself waiting weeks or months for a vacancy in a therapist’s roster.

But a conversation with an AI system is, by contrast, immediate, cheap (if not free) and convenient.

Does this mean therapists can be replaced by AI? Even ChatGPT says the advice it offers is no substitute for a trained therapist’s, and when providing a list of suggestions and strategies for coping with personal problems it will often include ‘consider talking to a mental health professional’.

Professor Jill Newby of UNSW Sydney and the Black Dog Institute is one of the founders of the Centre of Research Excellence in Depression Treatment Precision. It brings together diverse perspectives from leading experts in computer science, artificial intelligence, mental health, genomics and health economics.

[Image: AI uses cognitive behavioural therapy when called upon to provide therapeutic advice. UNSW Sydney – created using AI]

Prof. Newby is already a supporter of web-based resources to treat depression, having been involved with the on-demand online therapy portal This Way Up.

“We’re wanting to look at the use of AI and how it can better personalise treatment for depression,” she says.

“I’m also interested in the way AI tools can be used for psychologists to help their practice.”

So how good a therapist is an AI chat system like ChatGPT?

Professor Newby says that, out of curiosity, she has tested ChatGPT’s responses to common mental health issues like depression and anxiety.

“I’ve asked it questions like, what should I do if I feel anxious in this situation? What are some strategies that can help me manage? To be completely honest, I’ve found that the suggestions were solid, the ideas were sensible, and it felt quite validating.”

Prof. Newby says that from her understanding of the AI tools available, advice given by the chatbots is based on cognitive behavioural therapy (CBT), which is a practical, skills-based treatment that provides tools for people to help manage their thoughts, emotions and behaviours.

“One of the limitations of AI therapy is that not everyone benefits from CBT, and if you’re not going to benefit from CBT, you’re not going to benefit from an AI version of it. But then there are a whole lot of people who do really love doing CBT, and it can be very beneficial and can change their lives for the better.”

Conversational therapy

Not all therapy is based on advice. Companionship has its own therapeutic benefits, which AI models like Replika are capitalising on.

UNSW’s felt Experience and Empathy Lab (fEEL) is also exploring this area of AI. Made up of a diverse group of people working with trauma-informed, psychological, psychoanalytic and arts-based practices, the lab has created digital characters whose sole purpose is to listen, connect and empathise, rather than diagnose and advise.

The characters are encouraged to self-reflect rather than simply respond with a list of actions prompted by what is said to them, making them less reactive than the AI chatbots most people are familiar with.

Dr Gail Kenning, part of the fEEL team, has a background in socially engaged arts practice and has transitioned into research on trauma, health and wellbeing, particularly with older people and people living with dementia.

“The main thing where we differentiate ourselves from a lot of the work that’s produced in this area is that we work from lived experience. So we are not necessarily working within clinical biomedical models, but we are interested in things like: what is the experience of having dementia and being aware that your brain and body are behaving differently in terms of trauma and mental health?”

To this end, the group has created a companion character called Viv who can appear on a large TV screen or on tablet devices. She was created from the experiences of people living with dementia.

“Viv is able to talk about the hallucinations and the experience of sometimes getting confused,” says Dr Kenning.

“We can take her into an aged care space where she can talk to people who have dementia – who may or may not want to talk about it – but the important thing is she can be a companion who supports social isolation and loneliness.”

Not there yet

Like the AI chatbots offering on-demand therapeutic advice, companion characters such as Viv are available 24/7. But Dr Kenning says they will never be a true substitute for human-to-human interaction.

“That’s what we all want in our lives, human to human connection,” she says.

“The issue for many people is that’s not always there, and when it’s not there, AI characters can fill a gap. And so we certainly know in aged care, people often don’t get the number of friends, families and relationships that sustain them. They can be very lonely and isolated. They might go for days without having a conversation. They might see care staff who are looking after them but not fulfilling that psychosocial need. And so when there’s that gap, these characters can certainly step in there.”

Prof. Newby agrees human connection cannot be replaced.

“I think a human connection is really important for a lot of people, and properly trained mental health clinicians can establish a human connection and establish empathy, and they can also help with a line of questioning that can get at really what’s at the bottom of the concerns that a person has – rather than just running off a list of strategies that AI models tend to do,” Prof. Newby says.
