
Who is using AI chatbot therapists? Here's what to know

As people turn to artificial intelligence chatbot therapy, some mental health experts warn of possible dangers 


Citing increased difficulty accessing traditional therapy, some people are turning to artificial intelligence chatbot therapy apps. 

As AI technology has developed, it has intersected with many cornerstones of everyday life, including various forms of psychotherapy.

Inside their clinics, some therapists are using AI for administrative work, note-taking, and training new clinicians, the American Psychological Association said in a June piece. Offloading those tasks to AI could create more space to care for clients. 

But despite AI's potential to increase therapists' bandwidth for care, mental health care can still be unaffordable and inaccessible for many, according to Dr. Paul Nestadt, an associate professor of psychiatry at the Johns Hopkins School of Medicine. 

That reality is leading some people to a tool that is often free or inexpensive and available on their phones: AI chatbot therapy apps. 

Easing depression

Emily LeBlanc, of Taunton, Massachusetts, used a wellness app called Happify for over a year. 

The app is "designed to help address symptoms of stress, anxiety, and depression, with activities based on CBT, mindfulness, and positive psychology," Ofer Leidner, co-founder and president of Happify, said in a statement to NBC. "While our app has been validated to show clinical results, we do not recommend Happify as a replacement for therapy or clinical support. Rather, it can be a powerful and effective complement to mental health care from a licensed therapist, social worker, or psychiatrist."

The app includes an AI chatbot therapy feature. LeBlanc said the feature helped transform her life, but that it shouldn’t be used in every case. 

When LeBlanc began using the app, she was asked a set of questions about her age, pre-existing conditions and specific issues she was struggling to work through. 

Based on her answers, the app suggested a few learning courses to help combat negative thoughts, gave her daily prompts to promote positive thoughts and encouraged her to meditate. 

She was then introduced to the app’s AI chatbot, Anna. 

Anna asked LeBlanc questions about her life and had her identify supportive friends. LeBlanc said she routinely told Anna what was happening in her life each day.

“I’m having a really rough week,” LeBlanc said she told Anna.

“Oh, I’m sorry to hear that. Maybe you should contact your friend Sam?” Anna replied, prompting her to reach out to a supportive friend. 

After LeBlanc had worked with Anna for over a year, the app analyzed her use and showed her statistics reflecting her positive growth. She was given the option to share those statistics with other users to encourage the community. 

Despite the success she experienced using the app, LeBlanc does not believe Happify is a replacement for human therapy. 

“I don't think that it is the same as talking to somebody one-on-one,” LeBlanc said. “I think that what it does is give daily consistency.” 

Addressing emotional eating

Alisha Small, an entrepreneur and mom of three living in Maryland, had a similar experience with Wysa. 

Wysa has had over half a billion AI chat conversations with more than five million people. The app is intended to be used as a reflective space with supplementary mental health resources for individuals, serving as an adjunct to human therapists and counselors, the company said in a statement to NBC.

"Wysa is not designed to assist with crises such as abuse, self-harm, trauma, and suicidal thoughts. Severe mental health concerns such as psychosis are also not appropriate for this technology," Wysa said when asked about any disclaimers they give users.

The app uses rule-based AI, rather than generative AI, meaning all responses to users are drafted and approved by the Wysa team.
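For readers curious what "rule-based" means in practice, here is a minimal illustrative sketch, not Wysa's actual code; the keywords, replies, and function names are hypothetical. The idea is that the bot matches a user's message against pre-written rules and returns only responses a human team drafted and approved in advance, rather than generating new text.

```python
# Hypothetical illustration of a rule-based chatbot: every possible reply
# is written and approved by people ahead of time; nothing is generated.
APPROVED_RULES = [
    # (keywords to look for, pre-approved response)
    ({"stressed", "stress", "overwhelmed"},
     "It sounds like a lot is on your plate. Would you like to try a short breathing exercise?"),
    ({"sad", "down", "lonely"},
     "I'm sorry you're feeling this way. Could you tell me more about what's been happening?"),
]

DEFAULT_RESPONSE = "Thank you for sharing. Would you like to reflect on how today went?"

def respond(message: str) -> str:
    """Return a pre-approved response whose keywords appear in the message."""
    words = set(message.lower().split())
    for keywords, reply in APPROVED_RULES:
        if words & keywords:          # any keyword present?
            return reply
    return DEFAULT_RESPONSE           # fall back to a generic approved reply

print(respond("I'm feeling really stressed this week"))
```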

Small used the app in 2020 to begin treating high-functioning depression and emotional eating, she said. 

“There were so many things happening,” Small said. “This flexibility of being able to manage my mental health right from my home … that was the best thing about it.”

When Small started using the app, she picked the type of services she needed most and was then introduced to the AI chatbot therapist. 

After turning to it whenever she needed, she found it was having a positive impact on her mental health. 

“I felt like I had that relationship because of the accessibility,” Small said. “I can talk to someone, I can get what I need at any time, so that was a form of relationship.” 

She believes having access to the tool at any point during the day or night helped change her lifestyle. 

Since then, she’s lost over 50 pounds, moved states away and established a successful coaching and consulting business, she said. 

“It’s just helped me grow in all areas of my life and it’s helped me build a network from the ground up,” Small said. 

She now sees an in-person therapist, but said the AI chatbot therapist was a great tool that helped her identify what to address and create a treatment action plan. 

Treating social anxiety

Dedra Lash, a working single mom in Arizona, also found success using Wysa’s AI chatbot therapist. She uses it to address social anxiety. 

Lash said the feature lets her “vent without someone” judging her. 

She’s been using the app for over four years and is grateful for the support it’s provided her, but recommends anyone needing support for severe trauma see an in-person therapist. 

“It really just depends on the severity of your needing help,” Lash said. 

Some mental health professionals warn against AI chatbot therapists entirely, though. 

Possible dangers of AI chatbot therapists

Nestadt said he is not surprised by the recent rise in these apps, but believes the services likely can’t handle specific and fragile cases, making them potentially dangerous. 

“When we engage in therapy, there’s a lot of risks to confronting someone about a problem that they have,” Nestadt said. “Every therapeutic confrontation is, at its heart, a treatment … you say the wrong thing, you can do real damage to somebody.” 

Even though several AI chatbot therapy apps flag the use of certain words and phrases, like “suicide” or “hurt myself,” Nestadt believes the technology can’t care for people in crisis or gauge the depth of what a person is experiencing. 

Nestadt also said there are mental health crises that may not be flagged by the keywords programmed into the AI chatbot therapy apps. 
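To make that concern concrete, here is a hypothetical sketch of the kind of keyword screening described above; the phrase list and function are illustrative assumptions, not any app's real safety system. A message that signals a crisis without using one of the programmed phrases would simply not be flagged.

```python
# Hypothetical keyword screen of the sort Nestadt describes.
# It only catches crises phrased with the exact programmed terms.
CRISIS_KEYWORDS = {"suicide", "suicidal", "hurt myself", "kill myself"}

def is_flagged(message: str) -> bool:
    """Flag a message only if it contains one of the programmed phrases."""
    text = message.lower()
    return any(phrase in text for phrase in CRISIS_KEYWORDS)

print(is_flagged("I've been thinking about suicide"))    # True - flagged
print(is_flagged("I don't see the point of going on"))   # False - missed
```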

Additionally, even if a user’s conversation is flagged for a keyword and the person is referred to an in-person provider, staffing shortages remain an issue, and a human therapist would need to be available at all hours of the day, Nestadt said. 

When an in-person therapist is notified of a flagged conversation and begins speaking with the user, the therapist likely does not have the context necessary to treat that person well, he said.

“This could lead to a slippery slope of shoddy mental health care that arguably might do more harm than good,” Nestadt said. “In a way, it’s a watering down of mental health care, which can cause real problems.” 

As for the increased accessibility the technology could provide, Nestadt said there are other short-term treatment options focused on self-regulation that could be far less harmful than AI chatbot therapy apps. 

He called on those considering AI chatbot therapy apps to remember the bigger picture. 

“AIs are tools, just like any other technology, and there are certainly benefits to AI as long as we use them correctly,” Nestadt said. “But, as we get very excited about any new tools, there’s always a tendency to … over-rely on it before we really understand its limitations and potential dangers.”
