“I started thinking that I could build an AI-based therapist using the ChatGPT API and tweak it to meet the specifications for a therapist,” she said. “It increases accessibility to treatment by offering free and confidential therapy, an AI rather than a human, and removing the stigma around getting help for people who don’t want to speak with a human.”
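To make the quote concrete: what she describes amounts to wrapping the ChatGPT API in a system prompt that steers the model toward a therapist-like role. Below is a minimal sketch of that pattern, assuming the OpenAI Python client; the prompt wording, model name, and `em_reply` helper are illustrative assumptions, not Em’s actual configuration.

```python
# Minimal sketch: calling the ChatGPT API with a system prompt that
# steers the model toward a supportive, therapist-like tone.
# The prompt text and model name are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Hypothetical "specifications"; Em's real prompt is not public.
SYSTEM_PROMPT = (
    "You are a supportive, non-judgmental listener. Respond with empathy, "
    "ask open-ended questions, and never present yourself as a substitute "
    "for a licensed mental health professional."
)

def em_reply(user_message: str) -> str:
    """Send one user message and return the model's reply."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(em_reply("I've been feeling overwhelmed at work lately."))
```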
In theory, AI could be used to help meet the rising need for mental health options and the shortage of mental health professionals to meet those needs. “Accessibility is simply a matter of a mismatch between supply and demand,” Iyer told BuzzFeed News. “Technically, the supply of AI could be infinite.”
In a 2021 study published in the journal SSM Population Health that included 50,103 adults, 95.6% of people reported at least one barrier to health care, such as the inability to pay for it. People with mental health challenges appeared to be especially affected by barriers to health care, including cost, a shortage of experts, and stigma.
In a 2017 study, people of color were found to be particularly vulnerable to health care barriers as a result of racial and ethnic disparities, including high levels of mental health stigma, language barriers, discrimination, and lack of health insurance coverage.
One advantage of AI is that the program can translate into 95 languages in a matter of seconds.
“Em users are from all over the world, and since ChatGPT translates into multiple languages, I’ve noticed people using their native language to communicate with Em, which is very helpful,” Brendel said.
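For illustration, here is a small sketch of why no separate translation step is needed, under the same assumptions as the sketch above (the OpenAI Python client and an illustrative model name): the model detects the language of the incoming message and can simply be instructed to answer in it.

```python
# Sketch of the multilingual behavior Brendel describes: the same chat
# endpoint accepts a message in any supported language and, if instructed,
# replies in kind. The model name and prompt are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "Reply in the same language the user writes in."},
        # A user writing in Spanish; no translation layer is configured.
        {"role": "user", "content": "Últimamente me siento muy ansioso."},
    ],
)
print(response.choices[0].message.content)  # the reply comes back in Spanish
```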
Another advantage, Brendel said, is that although AI can’t provide true emotional empathy, it also can’t judge you.
“AI tends to be non-judgmental in my experience, and that opens a philosophical door to the complexity of human nature,” Brendel said. “Even if a therapist presents as non-judgmental, we as humans tend to be judgmental anyway.”
Where AI should not be used as an option
However, mental health experts warn that AI may do more harm than good for people looking for more in-depth guidance, those who need medication options, and those in crisis.
“Predictably controlling these AI models is something that is still being worked on, so we don’t know in what unintended ways AI systems could make catastrophic mistakes,” Iyer said. “Since these systems don’t know true from false or good from bad, but simply report what they have previously read, it’s entirely possible that AI systems have read something inappropriate and harmful and will repeat that harmful content to those seeking help. It’s too early to fully understand the risks here.”
People on TikTok also say the online tool needs adjustments; for example, they say the AI chat could provide more helpful feedback in its responses.
“ChatGPT is often reluctant to give a definitive answer or make a judgment about a situation that a human therapist might be able to provide,” Kayla said. “In addition, ChatGPT somewhat lacks the ability to offer a new perspective on a situation that the user may have overlooked but that a human therapist could see.”
While some psychiatrists believe that ChatGPT can be a useful way to learn more about medications, it should not be the only step in treatment.
“It might be a good idea to think of asking ChatGPT about medications the way you might look up information on Wikipedia,” Torous said. “Finding the right medication is about matching it to your needs and your body, and neither Wikipedia nor ChatGPT can do that right now. But you may be able to learn more about medications in general so you can make a more informed decision later.”
There are other alternatives, including calling 988, a free crisis hotline. Crisis hotlines have calling and messaging options available for people who cannot find mental health resources in their area or lack the financial means to connect in person. In addition, there are the Trevor Project hotline, the SAMHSA National Helpline, and others.
“There are really great and accessible resources, like calling 988 for help, that are good options when a crisis occurs,” Torous said. “Using these chatbots during a crisis is not recommended, because you don’t want to rely on something untested, and not even designed to help, when you need help the most.”
AI therapy can be a useful outlet for venting emotions, but until further improvements are made, it cannot outperform human experts, the mental health professionals we spoke to said.
“Right now, programs like ChatGPT are not a viable option for those looking for free therapy. They can offer some basic support, which is great, but not clinical support,” Torous said. “Even the makers of ChatGPT and related programs are very clear about not using these programs for therapy right now.”
Call 988 in the US to reach the National Suicide Prevention Lifeline. The Trevor Project, which provides help and suicide-prevention resources to LGBTQ youth, can be reached at 1-866-488-7386. Find other international suicide helplines through Befrienders Worldwide (befrienders.org).