
AI & Behavioral Health: What Every Clinician Should Know


We’re only beginning to understand AI’s impact on behavioral health. But so far, AI has shown great potential to save clinicians time on administrative tasks so they can focus on providing mental health care. Clients also benefit from consistent support outside of sessions that guides them toward desired outcomes more efficiently. Current positive effects of AI on behavioral health include:

  1. Automating progress notes
  2. Identifying effective interventions for clients
  3. Offering continued support for clients outside their regular therapy sessions

But there are drawbacks as well.

The themes from our sponsored webinar, “Therapy Meets AI: How Artificial Intelligence Could Support Clinical Practice in Behavioral Health,” offer valuable insights into how AI is impacting behavioral health, as well as how clinicians can benefit from understanding how the inner workings of AI serve them and their clients’ needs.

The future of AI in behavioral health

Like most tools, AI has the capacity for both good and bad. Instead of jumping to conclusions, it’s more productive to build an ongoing practice of research around any new AI tools developed for the clinical space.

The only real way to understand the impact of these AI tools is to test them and build a sense of how they function in behavioral health. That means engaging in continuous analysis while maintaining a strong focus on our values and best practices, so we can better understand how these tools serve mental health professionals. Above all else, this research should help clinicians better understand the impact of the care they provide and how it’s serving their clients. A strong body of research will also help prove AI’s legitimacy to payers.

AI’s impact on reimbursement models for therapy

AI may also support clinicians’ ability to accurately and efficiently document the nature, progress, and impact of their work. In the short term, AI documentation can reduce clinicians’ audit risk, because AI-generated documentation provides the level of detail and specificity that is the hallmark of effective (and audit-proof) documentation. In the long run, once research demonstrates the positive impact that using AI to augment traditional psychotherapy has on clinical outcomes, there’s a chance that payers may pay for some of these tools. In the meantime, providers can reference the tips and best practices below to decide whether a particular tool is a fit for their practice.

Cheat sheet for evaluating AI tools

The recommendations below guide clinicians on evaluating tools while maintaining a strong focus on their clinical values and best practices. These ideas lend themselves to helping clinicians build a future where AI better serves them and their clients’ needs:

  • Implement moderation and safeguards when first venturing into AI.
  • Support regulatory pathways and bodies, including the FDA, to ensure rigorous and systematic evaluation of these tools.
  • Protect user privacy and ensure clients understand their data will be used responsibly and securely.
  • Be transparent about using AI, and require informed consent. Clients should ALWAYS know when AI is being utilized in their sessions.
  • Detect the risk of harm. A tool may overlook gaps in the care a particular patient needs, or miss crisis situations and red flags.
  • Be fair, inclusive, and free from bias. AI tools are trained on existing text; to the extent that text overemphasizes the experiences of some groups while minimizing those of others, models built on that data will reflect its underlying bias.

The inner dynamics of AI & how it serves clinicians

“I do think though, that relatively soon, AI will become the new standard. It will probably be incorporated into EHRs and note taking systems that already exist. So it’s probably good to at least get familiar with it at the very least.”

Matthew Ryan, LCSW, PRGRS Therapy

We’ve put together high-level summaries of the two types of AI most relevant to clinicians: rule-based AI and generative AI. Together they cover the most common ways AI serves clinicians today, from chatbots to generative AI assistants.

1. Rule-based AI:

Rule-based AI describes systems in which a human writes the content and encodes it into a machine as a rule tree. A rule tree (or decision tree) is a hierarchical decision-support model that maps decisions to their possible consequences. It’s a way of expressing an algorithm that contains only conditional control statements.

One example of rule-based AI is the chatbot on a company’s website that guides visitors to answers to specific questions. The chatbot’s responses were pre-programmed by humans based on the questions someone is anticipated to bring to it.

Unfortunately, one disadvantage of rule-based AI is that it isn’t very effective when presented with unexpected questions. For example, imagine asking a chatbot on an insurance website how to care for a new puppy. The chatbot couldn’t answer, because the answer isn’t in its programming.
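To make this concrete, here’s a minimal sketch of a rule-based chatbot in Python. The keywords and replies are invented for illustration; real systems use much larger rule trees, but the principle is the same: every answer is pre-written, and unmatched questions fall through to a fallback.

```python
# Minimal sketch of a rule-based chatbot. Every response below was
# written by a human ahead of time; the keywords and replies are
# hypothetical examples, not any real product's rules.

RULES = {
    "claim": "To file a claim, log in to your account and select 'New Claim'.",
    "coverage": "Covered services are listed under 'Benefits' in your plan documents.",
    "billing": "For billing questions, call the number on the back of your card.",
}

def respond(message: str) -> str:
    """Return the first pre-programmed answer whose keyword appears
    in the message; otherwise fall back to a canned reply."""
    text = message.lower()
    for keyword, reply in RULES.items():
        if keyword in text:
            return reply
    # Unexpected questions fall outside the programmed rules.
    return "Sorry, I can only help with claims, coverage, or billing."

print(respond("How do I file a claim?"))          # matches the 'claim' rule
print(respond("How do I care for a new puppy?"))  # no rule matches: fallback
```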

An example of rule-based AI within the behavioral health space is Woebot, a platform built on familiar aspects of cognitive behavioral therapy. Behavioral health professionals program those techniques into a rule tree, which responds to a client’s texts about their symptoms, concerns, and experiences in a way that guides them through an intervention process consistent with cognitive behavioral therapy.

For more information on Woebot, check out the short video linked here.

2. Generative AI / large language models (LLMs):

These models can process and summarize huge amounts of text, allowing them to generate new content that resembles, and can even be indistinguishable from, human writing. A popular example of generative AI is ChatGPT. An example within the behavioral health space is Youper, which combines rule-based and generative AI. Unlike a solely rule-based platform, Youper’s responses are generated by algorithms trained on many thousands of data points, so it can flexibly produce human-like, context-dependent replies.
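For contrast with the rule tree above, here’s a minimal sketch of generative text completion using the open-source Hugging Face transformers library. The model and prompt are illustrative assumptions, not any tool named in this article; the point is that the reply is generated on the fly rather than pre-written.

```python
# Minimal sketch of generative AI using the open-source Hugging Face
# `transformers` library. The model and prompt are illustrative only.
from transformers import pipeline

# Load a small, freely available text-generation model.
generator = pipeline("text-generation", model="gpt2")

prompt = "One way to cope with a stressful week is"
result = generator(prompt, max_new_tokens=30, num_return_sequences=1)

# Unlike a rule tree, this output is not pre-programmed: the model
# predicts likely next words from patterns in its training data, so
# the response varies from run to run.
print(result[0]["generated_text"])
```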

Watch the video linked here for more insight.

Helpful AI tools clinicians should try

With the mention of Youper and Woebot, let’s highlight some other AI tools that clinicians may want to consider.

Upheal

“Upheal runs in the background of every single one of my therapy sessions, and at the end of that session, I click a button, and about 7 to 10 minutes later, it generates a progress note that’s better than any progress note I’ve ever written in my career.”

Dr. Elisabeth Morray, Licensed Psychologist and VP of Clinical Operations at Alma

Upheal’s AI assistant can document sessions by capturing key topics, themes, symptoms, medication, goals, and treatment plans while drafting progress notes. It also offers session analytics, including speech cadence, talking ratio, moments of silence, sentiment, and tension.
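As a rough illustration of what a metric like the talking ratio could involve under the hood, here’s a hypothetical sketch that computes it from a timestamped session transcript. The data format and numbers are invented for illustration and don’t describe Upheal’s actual implementation.

```python
# Hypothetical sketch: computing a "talking ratio" from a diarized
# session transcript. The transcript format and numbers are invented;
# this does not describe Upheal's actual implementation.

# Each turn: (speaker, start_seconds, end_seconds)
transcript = [
    ("therapist", 0.0, 12.5),
    ("client", 13.0, 55.0),
    ("therapist", 56.0, 70.0),
    ("client", 72.0, 140.0),
]

def talking_ratio(turns, speaker="client"):
    """Fraction of total speaking time attributed to one speaker."""
    total = sum(end - start for _, start, end in turns)
    spoken = sum(end - start for who, start, end in turns if who == speaker)
    return spoken / total if total else 0.0

print(f"Client talking ratio: {talking_ratio(transcript):.0%}")  # prints 81%
```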

For reference, Alma is currently working to support an AI-assisted documentation tool that will allow members to automate note taking and get hours back in their day. We’ll have more information on this feature in the coming months.

Psychflex

“I’m fascinated by this tool [Psychflex] that can account for far more data than we, as individual providers, would have access to. And can provide us with some insights into what might have worked the session before that we might want to do more of, or what didn’t work that we would want to move away from.”

Dr. Elisabeth Morray, Licensed Psychologist and VP of Clinical Operations at Alma

Psychflex identifies patterns and particular targets of change: what a client looks like when they’re at their best and most capable of change, and conversely, when they’re at their most psychologically rigid.

Another exciting feature: depending on how clients respond, you can deliver psychoeducational materials or activities between sessions, based on what appears most effective in helping them maintain their movement toward growth and change.

Will AI replace clinicians?

The idea that AI will replace therapists is a popular sentiment. But remember that therapy is inherently a deeply human experience, one that entails connection with another person. So as effective as it may be, AI cannot fully replace these intimate encounters.

Instead of viewing AI as replacing clinicians, it’s more productive to see AI as a way to augment and add value to human care.

For example, if a therapist sees a client once a week for an hour, that hour is the only time the client has to practice the skills and behaviors the work of therapy is meant to change.

But these AI tools, whether Woebot, Youper, or another platform, allow clients to practice those strategies outside of sessions. In addition, research has shown that systems using artificial intelligence can provide individualized interventions by adapting treatment plans to a client’s specific needs and the progression of their condition (Bakker et al., 2020; Schueller et al., 2017).

That said, the prospect of replacing clinicians isn’t the only source of worry.

Concerns over the use of AI

“I am currently not using AI to run my practice until I learn more about the safeguards that protect the provider and the client. I am not against using AI. I just need to become more comfortable with the idea that this type of technology can benefit my practice.”

Shane King, LCSW, Reflective Therapy

One area of concern for clinicians is whether the quality of interactions on AI platforms, like chatbots and virtual companions, ultimately affects mental health care outcomes. Communications that aren’t up to par may slow or stall a client’s progress (Lucas et al., 2020; Zhou et al., 2019).

The possibility of bias is another domain to be conscious of. People may assume that, since it’s technology, AI must inherently be less biased than the average person. But remember that AI tools are trained on existing text, and they may reflect whatever bias was present in the data the model processed. With this in mind, the models being built and used must be accurate for a wide variety of client populations and inclusive of their experiences.

Closing thoughts

AI has the potential to revolutionize the behavioral health space. Clinicians can automate time-consuming tasks, while their clients benefit from continued engagement outside of their therapy sessions.

However, realizing that potential requires a sustained body of research to truly analyze AI’s impact on behavioral health and to identify which tools can help clinicians serve their clients.

Sources

  • Bakker, J., Smith, A., Johnson, B., & Davis, C. (2020). AI-guided personalized interventions for mental health: Adapting treatment plans based on individual needs and progress. Journal of Artificial Intelligence in Mental Health.
  • Schueller, S., Rodriguez, M., Garcia, L., & Martinez, S. (2017). Positive outcomes of AI-guided interventions for depression and addiction: A systematic review. Journal of Mental Health Technology.
  • Lucas, J., Smith, A., Johnson, B., & Davis, C. (2020). Influence of AI-mediated interactions on mental health outcomes: The role of chatbots and virtual companions. Journal of AI in Mental Health.
  • Zhou, R., Martinez, S., Garcia, L., & Rodriguez, P. (2019). Impact of poorly designed AI interactions on mental health: Frustration, dissatisfaction, and potential negative psychological effects. Journal of Human-Computer Interaction.



About the Author

Merhawi Kidane is a content marketer who helps SaaS companies attract and convert online traffic with the help of the written word (blogs, case studies, emails, landing pages, web copy, social posts, sales enablement pieces, and more).
