Why experts think AI won’t replace physicians in mental health

Artificial intelligence is as good as humans at identifying red-flag language in text messages from people with mental illness, a recent study found. But with many clinicians worried about being replaced by AI, should these findings concern them?

Replacement won’t be an issue, said Justin Tauscher, lead author of the University of Washington Medicine study. If anything, AI tools exist to support clinicians, not replace them.

“The hope is that these tools and potential interventions that might be developed … will help give clinicians more insight into how they might be able to best support their work,” Tauscher said in an interview. “So rather than replacing clinicians or changing how their work is done, it’s just adding to the work that can be done and adding to the tools that somebody has available to them.”

Another industry expert agreed. AI is simply a support tool for clinicians, said Robin Farmanfarmaian, an entrepreneur who works with AI startups and the author of “How AI Can Democratize Healthcare.”

“AI is never going to replace doctors,” she said in an interview. “It’s doctors who use AI who will replace doctors who don’t. AI is a tool … the app can do one thing extremely well. So in the case of the study, it can only detect those few things that it was looking for, whereas the therapist is looking at so much more.”

The study, published in Psychiatric Services, collected thousands of text messages from 39 patients over 12 weeks as part of a randomized controlled trial. Clinicians evaluated the texts for cognitive distortions, thought patterns that can worsen depression and anxiety. The researchers also used natural language processing, training computers to identify the same distortions. Specifically, both the clinicians and the computers searched for five cognitive distortion types in language: mental filtering, jumping to conclusions, catastrophizing, “should” statements and overgeneralizing.
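To get a feel for what flagging these distortion types in text might look like, here is a deliberately simplified sketch in Python. The study trained a classifier on clinician-labeled messages; this example instead uses invented keyword cues for the five distortion types, purely to illustrate the task, not the researchers’ actual method.

```python
import re

# Hypothetical keyword cues for the five distortion types named in the
# study. These cue lists are invented for illustration -- the study's
# real classifier was trained on labeled messages, not hand-written rules.
DISTORTION_CUES = {
    "should_statements": [r"\bshould\b", r"\bmust\b", r"\bought to\b"],
    "catastrophizing": [r"\bdisaster\b", r"\bruined\b", r"\bworst\b"],
    "overgeneralizing": [r"\balways\b", r"\bnever\b", r"\beveryone\b"],
    "mental_filtering": [r"\bnothing good\b", r"\bonly bad\b"],
    "jumping_to_conclusions": [r"\bthey hate me\b", r"\bi know they\b"],
}


def flag_distortions(message):
    """Return the distortion types whose cue patterns appear in the message."""
    text = message.lower()
    return [
        name
        for name, patterns in DISTORTION_CUES.items()
        if any(re.search(p, text) for p in patterns)
    ]


print(flag_distortions("I should never have gone; the whole day was ruined."))
# → ['should_statements', 'catastrophizing', 'overgeneralizing']
```

A real system would replace the keyword rules with a statistical model, but the interface is the same: a message goes in, and zero or more distortion labels come out for a clinician to review.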

This is the first study to apply natural language processing to text messages between patients with serious mental illness and their clinicians, according to Tauscher.

“This was the first, at least to our knowledge, attempt at training a classifier in back-and-forth text messages between individuals with serious mental illness and their clinicians for cognitive distortions … Just kind of knowing that we can leverage some techniques from natural language processing to get some more insight about what’s going on between a patient and their clinician and the therapeutic intervention is really exciting.”

Text messaging has become a prevalent part of mental health treatment, Tauscher said. It started in straightforward ways, such as appointment reminders, medication reminders and supportive statements, he said. Over time, text messaging in mental health settings grew into genuinely therapeutic conversations between clinicians and patients, such as pointing people to mental health resources in the community or checking in on clients between sessions.

“We’re learning a lot more about how to use text messages to actually deliver therapeutic interventions,” Tauscher said. “So challenging thoughts, relaxing, identifying cognitive distortions. Actual back-and-forth text messaging has been shown to be useful in this way and some clients actually really enjoy this sort of methodology or this modality because it’s convenient. They don’t have to necessarily come into an office for this and there’s something about the not face-to-face nature that’s helpful for some clients.”

Now that the researchers have these findings, they want to expand their work. Next, they plan to test their model on a different group of clinicians and patients, and to search for additional types of cognitive distortion, Tauscher said.

But most importantly, Tauscher wants to show clinicians how technology can support and improve their work, not replace them.

“In the mental health field, language is one of the most important ways that we provide care and it carries a lot of information that can be used to inform how we deliver our services … We can use advances from technology, from natural language processing and artificial intelligence, to capture the information that’s being conveyed in our language to improve the way that mental healthcare is delivered for folks over time,” Tauscher said.

Photo: metamorworks, Getty Images

