Since its release by OpenAI in fall 2022, ChatGPT, a free artificial intelligence chatbot, has been used by the masses in an ever-growing number of creative ways. From tweaking a resume to crafting travel itineraries, the technology can do it all in seconds, with the user only having to enter a few simple prompts.

Many are now discovering that ChatGPT’s capabilities go beyond job applications and finding the top spots for a European getaway. They’re turning to the chatbot for therapy, sharing with it feelings of anxiety and depression, relationship struggles and more, and heeding the guidance it fires back. These ChatGPT users are taking to social media to praise the technology, claiming that they no longer need a human therapist.

According to Anna Zacharcenko, PsyD, a clinical health psychologist and the director of the behavioral health education residency program at St. Mary Family Medicine Bensalem, the belief that AI technology can replace a living, breathing therapist is detrimental for several reasons: the risk of inaccurate diagnoses, threats to patient confidentiality and biases in the underlying data set.

ChatGPT’s ability to give a proper diagnosis can only go so far. This is because the AI’s knowledge of a person is limited to what that person chooses to share with it via text or voice, preventing the chatbot from getting the full picture. In fact, the tool itself displays a warning that reads, “ChatGPT can make mistakes. Check important info.”

“A patient could very well report, ‘I’m not sad. I’m very content with my life.’ They can deny anxiety. And yet as a clinician, when the patient is in front of us, we can see their nonverbal cues. As human beings, we get to see the inconsistency,” says Dr. Zacharcenko. “A face-to-face encounter with the patient allows the provider to observe critical nonverbal cues and behaviors, as well as a patient’s overall appearance, affect, mannerisms, etc., which can inform proper diagnosis.”

Since ChatGPT is still in its infancy, Dr. Zacharcenko has concerns about patient privacy. If there were to be a data breach, confidential and deeply personal information that individuals shared with the chatbot could be leaked and used in nefarious ways.

Additionally, such AI programs could carry biases against certain demographics, built into their underlying data and unbeknownst to users.

“With respect to accurate diagnosis and the appropriate clinically indicated treatment, the AI tool or program being utilized is only as good, reliable and valid as the developer—and the data set—behind it,” says Dr. Zacharcenko.

Using AI as a supplement to in-person therapy

Still, the growing link between AI and mental health isn’t without benefits. In fact, adds Dr. Zacharcenko, AI can be a very valuable educational tool in the field of psychology when it’s used as an enhancement to—rather than replacement of—in-person clinical treatment.

For example, patients struggling with mood regulation or social anxiety could benefit from personalized, psychologist-approved modules that help them examine their thoughts, feelings and behaviors on their own time, almost like a homework assignment. Their responses would then be discussed with their therapist, who can recommend proper interventions.

Such modules, explains Dr. Zacharcenko, are especially useful when helping children develop healthy emotional regulation, which is correlated with improved academic outcomes and increased success in navigating life’s challenges.

“Programmed instruction can be utilized to engage in learning to identify emotional states and the physiological/bodily cues which accompany emotional states,” she says. “So perhaps by displaying photos of an angry face, a sad face, an excited face, we can teach children how to identify and label these emotions more accurately.”

The effects of becoming reliant on AI

Beyond seeking full-time therapy from ChatGPT, many people, especially in younger generations, are becoming reliant on the AI chatbot for simple tasks like drafting a quick email or text. Turning to AI for such routine tasks can have negative effects in the long run.

“My concern is a lack of confidence in one’s ability to think critically and to write eloquently in a grammatically correct way. That can prevent someone from fully developing their skills and what they’re capable of. You can get confident in how you use AI, but do you really know that it’s you?” says Dr. Zacharcenko. “It affects the ‘grit’ factor. It’s that persistence of, ‘I’m going to get this done. Here’s a task for me. I might initially feel overwhelmed or intimidated, but I’m going to break it down in such a way that I’m going to acquire this skill and master it’.”

Final thoughts

ChatGPT is a fascinating technology, with humans barely scratching the surface of its full capabilities. However, it is not a substitute for therapy. If an individual needs mental health support, the chatbot shouldn’t be the first place they turn.

“Don’t be afraid to reach out to your primary care provider if you are experiencing symptoms of depression or anxiety, and you’re not enjoying life as much as you would like to,” says Dr. Zacharcenko. “A good place to start is to talk with one’s practitioner. They can lead you in the best direction.”

Learn more about St. Mary Family Medicine Bensalem.