Behavioral and Mental Health
At a Glance
Q: Is AI therapy safe and effective for mental health?
A: AI can provide basic guidance, but it cannot diagnose, treat, or ensure privacy like a licensed therapist.
- AI chatbots are not FDA-approved for treatment.
- Data privacy with AI remains uncertain.
- Learn why human connection matters in therapy.
Artificial intelligence (AI) has steadily risen in popularity over the past few years, sparking curiosity and debate over its safety and uses. While some people avoid it altogether, others are diving into its possibilities.
In a recent podcast interview, Mark Zuckerberg, CEO of Meta and co-founder of Facebook, discussed our human need for companionship and how, eventually, AI might be able to help fill the gaps.
But should it?
Julie Manuel, clinical program manager of Kettering Health Behavioral Medical Center, has some insight into AI companionship, and what it means for therapy.
The human experience
"Humans are wired for connection," says Julie. We are social beings who require social connection as a basic human need.
That's part of why traditional talk therapy can be so effective. According to Julie, it offers compassionate conversations grounded in the shared human experience. A trained therapist, whether by telehealth or in person, can do more than just listen.
"Sitting with someone, we can read body language. We can have a different tone," she says. "Whereas with that AI therapy, it really does not create that conducive environment to a healing journey for that individual."
People who turn to an AI chatbot for therapy often receive cold, disconnected responses. Julie says, "Remember, most of their responses are going to be automated because it's a computer. It can't understand human emotion. It can't build personal connections."
It's also important to remember that these AI chatbots are not licensed professionals. Currently, no AI chatbots have been approved by the U.S. Food and Drug Administration to diagnose, treat, or cure a mental health disorder.
Is your personal information safe?
There are still many unknowns about AI, and an important one concerns your data.
Professional, licensed therapists are required to maintain confidentiality under HIPAA standards. If therapists break confidentiality, they face professional and legal consequences. However, AI chatbots are not held to the same standards. In fact, there is still much debate on how exactly to hold an AI chatbot accountable to HIPAA.
"When you enter into a chatroom for therapy, you're asked to provide some personal information," says Julie. "And unfortunately, what we don't know is where it's stored, how much of that is stored, or how much of that is shared."
Although AI has its uses, Julie urges caution. "It can be a real risk to the individual because they do not comply with some of the same standards that we have as traditional therapists."
It's just a tool
AI can be a very beneficial tool, but, as Julie says, "It's just that. It's just a tool we put in our toolbox to use if we need it."
Rather than using AI in place of traditional therapy, Julie recommends using it as a starting point or a tool to help you find resources.
"It might be an easy way to get some information on how to start a really difficult conversation with someone, or maybe just some advice on how to deal with something," she says. "But it definitely does not replace that connection that, as humans, we really need."
If you're struggling with your mental health or a disorder such as anxiety or depression, reach out to a behavioral health professional. If you're in crisis, go to your nearest emergency department.