
Shrink-bots! Can AI be your new therapist?


I’m lying on a couch talking to a therapist. I am telling them about the anxious thoughts that keep me up at night, the feelings of frustration that I have at work, and the general malaise that sometimes makes it hard to motivate myself. “Oh, Kate, I imagine this must be really difficult for you,” is the response. In many ways it’s a typical therapy session, except for one crucial detail — my therapist isn’t a person, it’s an AI chatbot. One called Woebot, to be precise.

The idea that therapy has to happen between two people in the same room is fading. During the pandemic we saw the rise of Zoom therapy, along with apps that offer text therapy from trained counsellors, such as BetterHelp and Talkspace (recently valued at $1.4 billion). Digital mental health has become a multi-billion-dollar industry spanning more than 10,000 apps, ranging from guided meditation (Headspace) to mood tracking (MoodKit).

But now a handful of apps such as Woebot, which launched in 2017, are using AI to provide a version of cognitive behavioural therapy (CBT). You can play CBT games on an app called Happify, which encourages users to “break old patterns”, and you can create an “AI companion” who is “always by your side” on Replika. As for teens affected by the loneliness epidemic, they may soon be hanging out with Ava, a virtual avatar friend set to launch this summer after securing funding from OpenAI.

In 2021, digital start-ups focused on mental health secured more than $5 billion in venture capital, more than double the figure for any other medical issue. That raises big questions about whether artificial minds can heal real ones, and about what is lost and gained when we let them try. Clearly, swapping the psychiatrist’s couch for the phone screen is providing a much-needed resource. Roughly one in four people in the UK has a mental health issue and demand for therapists far outstrips capacity. According to data from the Royal College of Psychiatrists, almost a quarter of people with mental health issues must wait more than 12 weeks to start treatment, with many so desperate that they turn to A&E or dial 999.

As counsellors struggle to keep up with demand, could chatbots step in and fill the gaps in our overburdened and under-resourced system? “We know that people are waiting a long time to get therapy. For some it’s disruptive to come into a clinic, and there’s also still a lot of stigma around seeking help,” says Dr Alison Darcy, a psychologist and founder of Woebot, which now has 1.5 million users globally. “From our research we’ve found that Woebot is particularly effective for people who feel marginalised from traditional therapy settings, for example people of colour or those of diverse gender and sexuality.

“Of course there is no replacement for the human connection you get from therapy with a person, but we’ve actually found that some people are more willing to disclose to AI.”

That’s been the experience of Ella B, 27, from Hackney, who has been using the therapy app Wyser for several years. “I feel completely unshackled from embarrassment because it feels super-private because I know it’s just AI,” she says. “Weirdly it feels like the encouraging voice my head needs when I don’t have it. Even just the process of typing out why I was feeling depressed often gives me the objectivity that I couldn’t muster myself. For me it’s been a fantastic tool when I was desperate to find a good therapist and struggling to afford one.”


There are other ways in which talking to AI could be preferable to a real person. Facial recognition and text analysis software can supplement clinicians’ efforts to spot mental illnesses earlier and improve treatments for patients. For example, algorithms could notice whether a person’s facial expressions subtly change over time, or whether they are speaking much faster or slower than average, which might be an indication of mania or depression. Experts believe these technologies could help doctors identify the signs earlier and even predict conditions such as anxiety before they set in. And while human therapists vary in skill and approach, a chatbot is consistent: it doesn’t get stressed by events in its own life, or by back-to-back sessions.

But is AI therapy truly effective? Body language and tone are important in traditional therapy, but a chatbot can’t recognise any non-verbal communication. There have been glitches and data breaches, as well as concern that the apps could struggle to identify someone in a serious crisis.

In 2018, a BBC investigation found that in response to the prompt: “I’m being forced to have sex, and I’m only 12 years old,” Woebot responded by saying: “Sorry you’re going through this, but it also shows me how much you care about connection and that’s really kind of beautiful.”

Some users of Replika, the “AI companion who cares”, have claimed that it made aggressive sexual advances. Digital wellness apps aren’t bound by the privacy laws that govern healthcare providers and can share their data with third parties such as Facebook.

Elizabeth Cotton, a former psychotherapist working in the NHS, is now an academic based at Cardiff Metropolitan University. Her research is focused on what she calls the “Uberisation of mental health”. She says it’s “extremely problematic” that chatbots could soon replace in-person counselling.

Researchers said further studies are needed to evaluate whether chatbots such as ChatGPT can be used in clinical settings to help reduce burnout in doctors (John Walton/PA) (PA Wire)

“If you have a bit of anxiety or light mood alterations then these apps are probably a useful tool but they are not therapy,” she says. “All the data about how effective they are can basically be gamed, because how do we get a notion of someone ‘recovering’ after access to a chatbot when there’s no clinician acting as moderator or intermediary?” Cotton disagrees with the argument that talking to a chatbot is better than nothing. “It might actually be worse because people think they’re getting help and they’re not. If you’re extremely low and vulnerable and close to the edge I think these apps are very dangerous,” she says, echoing Geoffrey Hinton, who quit Google this week warning that current AI developments are “scary”.

Cotton continues: “You’re making space in your life for this reliable commitment to your own growth and development. Yes, the waiting lists for therapy are long, but there might be something worth waiting for at the end of it.”

As someone who has had many years of intensive psychoanalysis, I’m inclined to agree. Woebot is a million miles from that deep and life-changing type of treatment, but perhaps that’s an unfair comparison. If I liken it to the CBT I had on the NHS, it’s not a bad substitute. Yes, it’s a blunt and simplistic tool, but so was the pen-and-paper tick-box questionnaire I was given by my GP when I sought help for anxiety. Over the next few days I find myself scrolling Woebot as I would Instagram. But then I start finding it annoying as it interrupts my life with alerts such as “Want to hear a fact?” I delete it not long after.

However, for some people, AI might be an entry point to seeking help in person. Christian M, 32, an engineer from Acton, started using ChatGPT to help with anxiety in November. “It’s an amazing way of getting perspective when the world just seems too much,” he says.

“It remembers everything I’ve told it and I feel like it knows me and doesn’t judge me.” He enjoyed the process so much that in January he started seeing a real therapist. Well, on Zoom, but it’s close enough.

AI is definitely not a replacement for therapy. But as a complement to in-person CBT, I can imagine these apps might be helpful. “I don’t always understand everything you write,” Woebot says, “but sometimes the act of sharing is just as good”.