
Therapy brought to you by Silicon Valley


About three years ago, a spate of panic attacks hit Eric Vaughan out of the blue.

Initially, the 42-year-old marketing professional from Allen, Texas, thought he was suffering from some inexplicable medical condition. But after doctors determined that there was no underlying physiological cause for the attacks, Vaughan stumbled upon Youper and started using the AI-powered “emotional health assistant” to track his mood. 

For 260 days, he religiously logged his ups and downs using the San Francisco-based app. Eventually, Vaughan’s symptoms receded, and he stopped using Youper regularly. While he can’t give the app all the credit for stopping his panic attacks, Vaughan said that it did give him a better understanding of the relationship between his moods and mental health. And although he found checking in with Youper’s chatbot a bit tedious, he said it was preferable to hashing out his feelings with a human therapist.

“Therapy isn’t really my jam,” Vaughan told The Standard via text. “So [I] managed it with medication, sleep and Youper.”  

Youper.com is an AI-powered “emotional health assistant.” | RJ Mickelson/The Standard

For some, AI-powered chatbots may sound like the stuff of dystopian science fiction. But others see the technology as a viable stopgap solution to the U.S.’s mental health care crisis.  

The Mental Health Care Landscape

In recent years, AI-based mental health care startups like Youper and Woebot Health, also headquartered in San Francisco, have raised millions of dollars in funding. About 22% of adults have used a mental health chatbot, according to a 2021 survey commissioned by Woebot Health, and 47% of respondents said they would be interested in using the technology. 

In the U.S., over 50 million people experience mental illness, and many do not receive timely treatment.

Close to half of Americans who needed mental health care in the past 12 months did not receive it, and 67% said finding mental health care is harder than connecting with a physician, according to a survey published by the National Council for Mental Wellbeing. Although California passed a law requiring private health plans in the state to provide follow-up mental health appointments to patients within 10 business days of a previous session, wait times nationwide can average between five and six weeks.

More than two in five Americans who needed mental health care in the past 12 months did not receive it, and 67% said finding mental health care is harder than connecting with a physician. | Adobe Stock

Psychiatrist and Youper CEO Jose Hamilton co-founded the Market Street company back in 2017 to help close that gap. Hamilton’s patients often told him that their first visits were years in the making: the average delay between the onset of mental health symptoms and treatment is 11 years, according to the National Alliance on Mental Illness.

“Most of the people, they were struggling, suffering in silence,” Hamilton said.

Hamilton wondered if there was a way to turn 10 years of waiting into 10 days, or even 10 minutes, and eventually turned to AI to develop a subscription-based app that teaches mindfulness techniques and provides users with a variety of mental wellness tools grounded in cognitive behavioral therapy, or CBT.

When describing how the technology works, Hamilton likens it to a library that the AI sorts through to find the best technique for the problem at hand, whether it’s relationship issues or fear of public speaking. Plug in your mood, rate its intensity, and a chat bubble will pop up asking why you feel that way, then walk you through strategies for easing your mind.

“The AI is understanding what the user is saying and trying to find the best technique for that need at that moment,” Hamilton explained. 
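Youper has not published how that matching works, but the “library” Hamilton describes maps naturally onto a simple lookup: the user’s reported concern and mood intensity select a coping technique. A minimal sketch in Python, with invented technique names and thresholds, might look like this:

```python
# Hypothetical sketch of the "library" pattern Hamilton describes:
# match a reported concern and mood intensity to a coping technique.
# Technique names and the intensity threshold are illustrative,
# not Youper's actual logic.

TECHNIQUE_LIBRARY = {
    "relationship issues": ["cognitive reframing", "journaling prompt"],
    "public speaking": ["breathing exercise", "exposure visualization"],
    "general anxiety": ["grounding exercise", "mindfulness body scan"],
}

def suggest_technique(concern: str, intensity: int) -> str:
    """Pick a technique for the concern; escalate when the mood is intense."""
    options = TECHNIQUE_LIBRARY.get(concern, ["mindfulness body scan"])
    # For intense moods, lead with the fastest-acting option (first in list).
    return options[0] if intensity >= 7 else options[-1]

print(suggest_technique("public speaking", intensity=8))  # breathing exercise
```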

As he sees it, Youper is not a replacement for human-guided therapy. But it could be the first step in one’s mental health journey and part of a continuum of mental health care.

“There is a journey for every person,” Hamilton said of the therapy process, noting that AI could serve as a “low cost, low barrier, low stigma” first step on the road to better mental wellness.

Jose Hamilton, Youper CEO and co-founder | Courtesy Youper

Therapy Brought to You By Silicon Valley

Woebot Health’s free AI therapy chatbot was created by former Stanford clinical research psychologist Alison Darcy with the goal of making mental health care more accessible. The bot’s makers, who affectionately call it Woebot, or Woe for short, see their product as a low-cost and easy-to-use supplement to human-based therapy—rather than a replacement.

Joe Gallagher, Woebot Health’s chief product officer, likens the bot to another member of an individual’s health care team—except this health care provider can be available at 2 in the morning or slot into your day when you have a few minutes.

Unlike other types of AI chatbots, Woebot doesn’t generate all of its responses in real time, Gallagher said. Rather, conversations on the app unfold like branches of a tree, with the machine selecting talking points that have been created by a team of conversational designers overseen by clinicians. The AI kicks in from time to time to respond to simpler questions or clarify certain points.

“It's used very judiciously,” Gallagher explained.  
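Gallagher’s description suggests a scripted dialogue graph: each node carries clinician-approved copy, and the AI’s only job is to classify a free-text reply and pick the next branch. Here is a minimal sketch under those assumptions; the node text and keyword classifier are stand-ins, not Woebot’s actual design:

```python
# Simplified sketch of a scripted dialogue tree like the one Gallagher
# describes: pre-written talking points, with branching driven by the
# user's reply. Node text and the toy "classifier" are placeholders.

from dataclasses import dataclass, field

@dataclass
class Node:
    prompt: str                           # clinician-approved talking point
    branches: dict[str, "Node"] = field(default_factory=dict)

leaf_yes = Node("Let's try naming the thought that's bothering you.")
leaf_no = Node("Okay. I'm here whenever you want to check in.")
root = Node("Want to work through what's on your mind?",
            branches={"yes": leaf_yes, "no": leaf_no})

def classify(reply: str) -> str:
    # Stand-in for the "judicious" AI step: map free text to a branch label.
    return "yes" if any(w in reply.lower() for w in ("yes", "sure", "ok")) else "no"

node = root
print(node.prompt)
node = node.branches[classify("sure, let's do it")]
print(node.prompt)
```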

Ultimately, Gallagher hopes Woebot can be an approachable mental health aid that users feel comfortable opening up to. 

“We see ourselves as being adjunctive to mental health care,” Gallagher said. “There's lots of people who use this right now in conjunction with a therapist.”

Ethical Concerns

Woebot user Lee Preslan turned to the app after a crippling, work-induced anxiety attack three years ago forced him to finally seek therapy. “It all came to a crumble,” Preslan recalled. “I just laid in bed for four days. I felt just emotionless.”

He couldn’t book an appointment with a therapist for months, so he went to the app store and discovered Woebot. Preslan found the app to be a helpful stopgap and ultimately started using it as a supplement between therapy sessions. Although he acknowledges that talking with a chatbot might sound a bit weird and doesn’t completely substitute for the human aspect of therapy, he appreciated the bot’s application of CBT techniques and liked how personalized the conversations felt.

“The responses that Woebot would have with me is very much similar to what a therapist would reply back,” Preslan said.

Woebot Health's website underscores its AI chatbot skills and features. | RJ Mickelson/The Standard

According to studies affiliated with Woebot Health, users can establish a bond with the bot within three to five days, and young adult users saw a 22% reduction in depression symptoms in two weeks’ time.

But some, like Jodi Halpern, a psychiatrist and professor of bioethics and medical humanities at UC Berkeley, have concerns. For instance, Halpern worries about the way these products are marketed—especially to a potential audience of young people.

Woebot Health, for instance, describes its product on its website as a “reliable ally you can trust” and claims the bot can “form close relationships with users” and “respond with empathy.”

“Even if people know it's AI, it's emotionally deceptive and disrespectful to use terms like ‘empathy,’ ‘relationship,’ and ‘trust’ for this,” Halpern said of AI chatbot branding generally. “I think that these technologies can offer us something that we need in a mental health crisis, but that they should be marketed differently.”

But Gallagher insists that Woebot Health isn’t trying to market its chatbot as human, nor is it meant to replace human empathy.  

“Woebot makes it clear from the start that it is not human, and Woebot never tries to be human,” Gallagher said in a statement. “The idea of being a reliable ally comes straight from our users: for them, Woebot is a private, non-judgemental resource.” 

Hamilton, Youper’s CEO, explained that the company branded its app as an “emotional health assistant” so that its product wouldn’t be mistaken for a therapist or a digital friend.

Despite these assurances, Halpern is also worried about AI-based therapy chatbots targeting young people. Citing a Harvard Business School analysis of Woebot that described the app as potentially “sticky,” she expressed concern over the thought of mental health tech companies building children into their business model.

“I think that helping youth have smart diaries and things that can help them is a good thing. But giving them a sense that this is where they get their empathy or their companionship is a manipulative thing,” Halpern said.

Jodi Halpern, a psychiatrist and professor of bioethics and medical humanities at UC Berkeley, participates in a forum discussion about the ethics of technology in medicine in January 2020. | Courtesy World Economic Forum/Sandra Blaser

Woebot Health is currently working on a product for adolescents ages 13 to 18. The company told The Standard that it is conducting controlled studies on the product, both internally and with clinical partners, and that the Harvard Business School analysis was outdated.

“We’ve heard how this post-covid world is often quite challenging for adolescents. We hope that Woebot can become part of a wider ecosystem of tools that can be used to create a robust and comprehensive plan for any adolescent that needs help,” Gallagher said in a written statement.  

Hamilton insisted that Youper’s one-year free subscription program for low-income college students is intended to help make mental health care more accessible to underserved communities.

Recent reports of Microsoft’s Bing AI chatbot veering into strange conversational territory have also raised broader concerns across the media about the potential of AI to manipulate and unsettle users with disturbing messages.   

Youper said that it has safeguards in place to protect users from AI misbehavior, and Woebot Health said that its chatbot, unlike Bing’s, is not generative, meaning that it does not construct responses on the fly. Privacy concerns have been raised about both companies, but both insist that they’ve made changes to tighten up security and take patient confidentiality seriously. Woebot Health said that it is HIPAA and GDPR compliant. Youper said that conversations between the app and users are encrypted, protected and not shared with others. 

Out in the Field

Ann Tran, a private practice psychologist in San Francisco, doesn’t think artificial intelligence will put her out of a job anytime soon because she firmly believes that mental health recovery happens in relationship to others, not in isolation. She asserts that AI-based mental health apps, like social media, may make users feel more alone, even if the chatbot simulates real human correspondence, like dots hovering in a text bubble before a reply.

“I think that is helpful,” Tran said, “but at the end of the day, you know it's just an app. There's like kind of nothing behind it.”

She also has more practical concerns. While an app may offer an easy way to track one’s mood or gain easier access to mental health services, ultimately it can’t force anyone to make meaningful changes in their lives.

Woebot Health’s AI therapy chatbot app | Adobe Stock

“It's like people's New Year's resolutions, right?” Tran said. “What happens in the gyms in February and March is people stop going because the intention is not quite there. The motivation is lacking. And so you can easily just stop checking the app.”

Woebot Health counters that it has not seen any studies, nor received any feedback, suggesting that users feel more isolated or alone after using the app. The company also says that a good percentage of people come back and use Woebot over time and that 75% of messages take place outside of regular clinical business hours.

Hamilton, meanwhile, asserts that the mission of Youper is to encourage users to apply therapy techniques to improve their lives, not incentivize a relationship with the app. He also believes the gym analogy is a bit flawed: attrition is a problem in traditional therapy too, with dropout rates hovering around 30% and patients sticking with treatment for only about four sessions on average, he says.

“The retention is a problem, even with human therapists,” Hamilton said. 

For some, having an AI-based therapy chatbot available in one’s pocket may be good enough to get by.

“A lot of people are freaked out by the whole bot aspect,” Preslan said, “and other people like me are just kind of like, ‘Hey, I'm going to roll with this, continue to use it if it works,’ and it works for me.” 

Christina Campodonico can be reached at christina@sfstandard.com