AI is having a profound influence on healthcare across multiple domains, yet the technology is often more of a support tool than a wholesale replacement for human expertise.
With many people priced out of in-person mental health therapy or facing long waiting lists on the UK National Health Service (NHS), and with ChatGPT eclipsing 700 million weekly users worldwide, AI chatbots are growing in popularity as “therapy substitutes”.
According to the British Medical Association (BMA), an estimated one million people in the UK were on waiting lists for in-person mental health services last year. In the US, research indicates that access is similarly challenging, with cost, stigma, and a growing shortage of mental health professionals cited as the key barriers to in-person mental healthcare.
Given that AI chatbots such as ChatGPT offer a free and more accessible way for individuals to discuss their mental health concerns, it is perhaps unsurprising that people are increasingly turning to them in place of in-person therapy.
However, Kim Rippy, a trauma and anxiety specialist and owner of US-based Keystone Therapy Group, told Medical Device Network that while AI chatbots can help users summarise or organise their thoughts, their responses miss much of what a therapist must consider when supporting an individual who is struggling with their mental health.
Rippy explained: “The likes of ChatGPT can’t pick up on nuances of language, behaviours, non-verbal cues, tone, syntax, and emotion that a human therapist can.
“A good therapist doesn’t just reflect back your own thoughts as ChatGPT does, but they join with you in your emotional experience to offer safety and guidance to better understand yourself and your relationships with others or the world around you.”
Critically, Rippy noted that AI chatbots cannot gauge when someone is at risk and may inadvertently push a person past their ability to safely regulate themselves. Nor is AI a mandated reporter, equipped to recognise when action is necessary, such as when an individual expresses suicidal intent.
Rippy continued: “AI is a tool, but not a relationship. Think of it as a computerised mirror: it reflects what it receives.
“A therapist reimagines what they receive. For example, my clients show me an image of their trauma throughout their lives. My role as their therapist is to join with them in their trauma experience while establishing a new felt sense of safety for them, which they were never provided before. This is something only a connected human can do.”
