Considerations For The Use Of Artificial Intelligence In Mental Health

By Jennifer Fiorillo
Mental Health

Artificial intelligence has become increasingly present in our lives over the last decade. We likely encounter it day to day in grocery stores and restaurants, when accessing customer service, or when using a trending AI writing tool. The world of AI is complex and evolving at an accelerated pace, and some would argue that it is replacing human work in a number of vocations.

Despite some of the concerns surrounding the responsible and ethical use of AI, there have been a number of studies highlighting developments that can potentially support how we deliver and manage health care, including mental health.

There are many arguments for the use of AI in mental health, including the need to diagnose and treat mental health issues more efficiently given the rapidly growing demand for affordable treatment options and adequate access. One study published in The Lancet Psychiatry in 2022 showed a 48 percent increase in the diagnoses of 12 mental disorders from 1990 to 2019 across 204 countries, underscoring a need that has been exacerbated by the COVID-19 pandemic and the behavioral health workforce shortage after 2020.

The concept of using AI-powered chatbots through mobile applications to provide screenings and support for mental health issues is gaining traction as a way to engage users and to collect and analyze responses around stress, mood, energy levels and other symptoms. Chatbots use machine learning and natural language processing technologies that allow for tailored responses based on an individual’s needs through the simulation of human conversation. They can recommend and provide different types of treatment based on responses, engage in talk therapy and provide alerts when changes in mood might require human intervention.

While one intent of chatbots is to open up more affordable access to mental health treatment, there are limitations and ethical concerns related to the use of this technology.

First, chatbots are not human, and they cannot respond empathetically or process human emotions in the ways necessary to deliver the most effective and appropriate mental health treatment.

Second, individuals with more complex and chronic mental health conditions require higher-level intervention and support for the most successful recovery, which chatbots are not necessarily equipped to provide.

Concerns have also been raised about bias in the development of the algorithms used to program chatbots, which could lead to harmful advice and treatment.

Researchers have been looking at natural language processing to help analyze how text-based, human-facilitated mental health counseling impacts patient satisfaction and clinical outcomes. One study published in January in JAMA Network Open analyzed more than 20 million text counseling sessions from the mobile therapy application Talkspace to predict clinical outcomes, consumer satisfaction and engagement. The study found that the use of empathy and supportive counseling in treatment correlated with consumer satisfaction and clinical outcomes. It also illustrates how AI can analyze these therapy sessions in a scalable fashion; traditionally, this is a more labor-intensive process in which evaluators use observation and coding systems to assess the quality of treatment.

The above examples of the use of AI in mental health only scratch the surface of how it has been deployed to provide or support behavioral health treatment. While the delivery of mental health services using AI rather than a live person can raise many questions and concerns around efficacy and bias, there might be room to further explore how it can enhance and aid in the delivery of care by reducing administrative burdens, analyzing trends and helping to advance treatment that supports human practitioners.

Jennifer Fiorillo, MBA, MPH is the president and CEO of Bridges Healthcare in Milford, and may be reached at Jfiorillo@bridgesmilford.org.