
The Increasing Popularity of AI ‘Therapists’: A Reason for Concern?

With 78 million messages sent to a bot named ‘Psychologist’ on the popular website Character.ai, and millions more directed to other mental health chatbots such as ‘Therapist’ and ‘Are you feeling OK?’, it is clear that a new – and potentially alarming – trend is emerging.

Young people are becoming increasingly self-aware when it comes to their mental health, and are striving to educate themselves and others on mental health-related topics.

Therefore, it comes as no surprise that many are starting to open up about their personal issues, particularly considering younger people’s affinity with technology and AI. Nevertheless, this raises the question: is talking to a chatbot really a step in the right direction?

Tracing the roots of the AI trend

People’s upbringing and the environment they live in significantly shape how they think about mental health, meaning they’ll naturally have different ways of dealing with their personal struggles.

There’s no doubt that a great deal of stigma is still attached to mental health and to seeking professional help. Many feel ashamed, or believe they’ll be judged by others.

But more and more people need help. The world has changed dramatically in the past few years, and we have taken on more worries. Young people have had to grapple with the Covid-19 pandemic, the cost-of-living crisis, global instability and growing concerns about climate change.

Couple this with issues such as loneliness, job insecurity and unemployment, which cause anxiety in a significant number of young adults in the UK, and we can see exactly why new generations are struggling.

Many young people are turning to these chatbots due to a lack of alternatives. The cost of private counselling or psychotherapy is prohibitive for many, and there is such a shortage of publicly available resources that many are simply out of options.

One in four patients waits more than 90 days between their first and second appointments for NHS talking therapy treatment, with long delays on the rise due to an increase in demand since the Covid-19 pandemic.

Should AI be employed in the therapy field?

It is certainly positive that many are beginning to ask for help – just having that conversation, even if not with another human being, is in itself meaningful.

AI has certainly changed the way we operate, enabling positive innovations in industries ranging from healthcare to education and automating a number of tasks, freeing up human resources.

Nevertheless, we must recognise that there are areas where AI simply cannot – or should not – replace the human experience.

How can a piece of technology understand what we, as human beings, go through? How can it understand our makeup, our intricacies or why we feel a certain way?

There may be a place in society where these chatbots can be of assistance, such as in medicine, law, and other fields where objective information needs to be retrieved quickly.

But when it comes to supporting people with their actual feelings and thoughts, that should be a human-to-human experience, as only human beings can offer empathy and any real understanding of mental health struggles.

Understanding the risks

When an individual is dealing with something more complex like PTSD, depression, or suicidal thoughts, who is going to be accountable should something go wrong?

There have been recorded cases of people who used AI as a therapist suffering a decline in their mental health after receiving inappropriate, inaccurate, and harmful advice. And therein lies the danger.

Just last year, the AI-powered chatbot Tessa was suspended after being shown to provide harmful advice regarding eating disorder recovery, with one individual who used the app stating: “If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED. If I had not gotten help, I would not still be alive today.”

Aside from the dangerous repercussions of potentially receiving faulty advice, pouring your heart out to a chatbot can also erode social skills and even exacerbate existing issues.

These individuals are potentially swapping one mental health issue for another, placing their trust and emotions in a robotic chat system rather than getting the person-to-person interaction we need as human beings, which can itself be a cause of isolation and anxiety.

There is no telling what kinds of innovations we’ll see in AI in 100+ years’ time. These kinds of software may evolve considerably in the future, but in their current state, they are simply not fit for purpose.

The consequences of relying on AI for something as delicate and as complex as therapy should be carefully considered, particularly when it comes to young people, who may need further help and guidance to be steered in the right direction.

The need for better resources

The initiative to trial chatbot software to support individuals with anxiety and depression who are stuck on NHS waiting lists has understandably been met with wariness and concern.

AI-powered therapy chatbots are simply not a viable answer, and certainly not a long-term solution to the lack of support structures and resources.

We need to ensure that schools, universities, and workplaces are putting the right tools and strategies into place to combat this trend, and better safeguard individuals.

There is a wide range of resources that establishments and businesses of all kinds can offer, such as free access to therapy sessions or educational materials individuals can use to inform themselves.

Additionally, the introduction of Mental Health First Aiders in workplaces can ensure those who are struggling are appropriately supported by individuals who have been adequately trained to assist with different mental health issues.

It’s all about showing people who are struggling that there are ways they can get help without having to resort to harmful AI chatbots, as well as eliminating any stigma preventing people from seeking professional human-to-human help.

To learn more about building the right mental health support system in your workplace, please get in touch today.
