In the rapidly evolving landscape of mental health care, artificial intelligence is emerging as a transformative force. Many social media users are turning to chatbots to discuss their problems and get feedback. For example, Taylor Nicioli turned to ChatGPT on her laptop to explore the AI bot’s therapeutic abilities.
Mya Dunham, 24, has been using the ChatGPT phone app for the last two months when she needs advice. About twice a week, she writes out her feelings and asks the bot for analysis and feedback. “My goal is to learn a new perspective, just to have a different viewpoint on it. Whatever I think in my head is going to be based off of my own feelings,” Dunham said. When Dunham shared her experience on TikTok, the responses were mixed. Some commenters said they also used chatbots for therapeutic purposes, while others felt uneasy about discussing personal issues with a robot.
“Some users might be more apt to open up when talking with an AI chatbot, and there’s some research supporting their efficacy in helping some populations with mild anxiety and mild depression,” explained Dr. Russell Fulmer, chair of the American Counseling Association’s Task Force on AI. However, Fulmer emphasizes that chatbots should be used in collaboration with human counseling.
A therapist can help navigate a patient’s personal goals and clarify any misconceptions from the chatbot sessions.
Chatbots assist mental health therapy
Dr. Marlynn Wei, a psychiatrist in New York City, pointed out the risks of using general chatbots. They might not have been designed with mental health in mind, and they might not have safety parameters for identifying severe issues that require a clinician. “Chatbots could give out incorrect information or just tell users what they want to hear instead of what a human therapist might recommend,” Wei said.
Dr. Daniel Kimmel, a psychiatrist at Columbia University, noted that while chatbots are good at sounding like a therapist, they lack the inquisitiveness of a human therapist who digs deeper into a patient’s issues. Chatbots might be more accessible for people without the financial means or time for traditional therapy.
Some mental health experts, like Fulmer, argue that in such cases, a chatbot might be preferable to nothing. However, Fulmer advises that vulnerable populations should not use chatbots without guidance. Privacy is another concern.
Conversations with professional therapists are covered by the Health Insurance Portability and Accountability Act (HIPAA), which ensures the privacy and protection of health information. However, chatbots are often not HIPAA-compliant, and the companies behind them may advise users not to share sensitive information. In summary, while AI chatbots can offer an accessible and approachable way for some users to explore their feelings, mental health experts recommend using them as a supplement to, not a replacement for, professional therapy.
It’s essential to understand the limitations and potential risks of relying solely on AI for mental health support.
Noah Nguyen is a multi-talented developer who brings a unique perspective to his craft. Initially a creative writing professor, he turned to development work for the ability to work remotely. He now lives in Seattle, spending his time hiking and drinking craft beer with his fiancée.




















