Artificial intelligence (AI) has steadily risen in popularity over the past few years, sparking curiosity and debate over its safety and uses. And while some people are avoiding it, others are diving into the possibilities.
In a recent podcast interview, Mark Zuckerberg, CEO of Meta and co-founder of Facebook, discussed our human need for companionship and how, eventually, AI might be able to help fill the gaps.
But should it?
Julie Manuel, clinical program manager at Kettering Health Behavioral Medical Center, has some insight into AI companionship—and what it means for therapy.
The human experience
“Humans are wired for connection,” says Julie. We are social beings, and connection is a basic human need.
That’s part of why traditional talk therapy can be so effective. According to Julie, it offers compassionate conversation grounded in the human experience. A trained therapist, whether by telehealth or in person, can do more than just listen.
“Sitting with someone, we can read body language. We can have a different tone,” she says. “Whereas with that AI therapy, it really does not create that conducive environment to a healing journey for that individual.”
When people turn to an AI chatbot for therapy, they often find the responses cold and disconnected. Julie says, “Remember, most of their responses are going to be automated because it’s a computer. It can’t understand human emotion. It can’t build personal connections.”
It’s also important to remember that these AI chatbots are not licensed professionals. Currently, no AI chatbots have been approved by the U.S. Food and Drug Administration to diagnose, treat, or cure a mental health disorder.
Is your personal information safe?
There are still many unknowns about AI, and an important one concerns your data.
Professional, licensed therapists are required to maintain confidentiality under HIPAA standards. If therapists break confidentiality, they face professional and legal consequences. However, AI chatbots are not held to the same standards. In fact, there is still much debate on how exactly to hold an AI chatbot accountable to HIPAA.
“When you enter into a chatroom for therapy, you’re asked to provide some personal information,” says Julie. “And unfortunately, what we don’t know is where it’s stored, how much of that is stored, or how much of that is shared.”
Although AI has its uses, Julie urges caution. “It can be a real risk to the individual because they do not comply with some of the same standards that we have as traditional therapists.”
It’s just a tool
AI can be a very beneficial tool, but, as Julie says, “It’s just that. It’s just a tool we put in our toolbox to use if we need it.”
Rather than using AI in place of traditional therapy, Julie recommends using it as a starting point or a tool to help you find resources.
“It might be an easy way to get some information on how to start a really difficult conversation with someone, or maybe just some advice on how to deal with something,” she says. “But it definitely does not replace that connection that, as humans, we really need.”
If you’re struggling with your mental health or a disorder like anxiety or depression, reach out to a behavioral health professional. Or, in extreme cases, go to your nearest emergency department.