Should You Use ChatGPT For Therapy? What Experts Say.

“I started to think I could build an AI therapist using the ChatGPT API and tune it to meet the specifications for a therapist,” she said. “It increases accessibility of therapy by providing free and confidential therapy, an AI rather than a person, and removing the stigma around getting help for people who don’t want to speak with a person.”

In theory, AI could be used to help meet the rising need for mental health services and the shortage of mental health professionals to meet those needs. “Access is simply a matter of a mismatch between supply and demand,” Ayers told BuzzFeed News. “Technically, the supply of AI could be infinite.”

In a 2021 study published in the journal SSM - Population Health that included 50,103 adults, 95.6% of people reported at least one barrier to health care, such as the inability to pay for it. People with mental health conditions were especially affected by barriers to health care, including cost, a shortage of specialists, and stigma.

In a 2017 study, people of color were particularly susceptible to barriers to health care as a result of racial and ethnic disparities, including high levels of mental health stigma, language barriers, discrimination, and a lack of health insurance.

One advantage of AI is that the program can translate into 95 languages in a matter of seconds.

“Em has users from all over the world, and since ChatGPT translates into several languages, I’ve noticed people using their native language to communicate with Em, which is really helpful,” Brendle said.

Another advantage is that while AI can’t provide true emotional empathy, it also can’t judge you, Brendle said.

“AI tends to be nonjudgmental in my experience, and that opens a philosophical door to the complexity of human nature,” Brendle said. “Though a therapist presents as nonjudgmental, as humans we tend to be judgmental anyway.”

When AI shouldn't be used as an option

Still, mental health experts warn that AI may do more harm than good for people looking for deeper information, who need medication options, or who are in crisis.

“Having predictable control over these AI models is something that’s still being worked on, so we don’t know what unintended ways AI systems could make catastrophic mistakes,” Ayers said. “Since these systems don’t know true from false or good from bad, but simply report what they’ve previously read, it’s entirely possible that AI systems have read something inappropriate and harmful and will repeat that harmful content to those seeking help. It’s way too early to fully understand the risks here.”

People on TikTok also say adjustments should be made to the online tool; for example, the AI chat could provide more helpful feedback in its responses, they say.

“ChatGPT is often reluctant to give a definitive answer or make a judgment about a situation that a human therapist would be able to provide,” Kyla said. “Additionally, ChatGPT somewhat lacks the ability to offer a new perspective on a situation that a user may have overlooked before a human therapist could see it.”

While some psychiatrists think ChatGPT could be a useful way to learn more about medications, it shouldn't be the only step in treatment.

“It may be best to consider asking ChatGPT about medications the way you would look up information on Wikipedia,” Torous said. “Finding the right medication is about matching it to your needs and your body, and neither Wikipedia nor ChatGPT can do that right now. But you may be able to learn more about medications in general so you can make a more informed decision later.”

There are other alternatives, including calling 988, a free crisis hotline. Crisis hotlines have calling and messaging options available for people who can't find mental health resources in their area or don't have the financial means to connect in person. In addition, there's the Trevor Project hotline, the SAMHSA National Helpline, and others.

“There are really great and accessible resources, like calling 988 for help, that are good options when in crisis,” Torous said. “Using these chatbots during a crisis is not recommended, as you don’t want to rely on something untested and not even designed to help when you need help the most.”

The mental health experts we spoke with said AI therapy can be a useful tool for venting emotions, but until more improvements are made, it can't outperform human experts.

“Right now, programs like ChatGPT are not a viable option for those looking for free therapy. They can offer some basic support, which is great, but not clinical support,” Torous said. “Even the makers of ChatGPT and related programs are very clear about not using these for therapy right now.”

Dial 988 in the US to reach the National Suicide Prevention Lifeline. The Trevor Project, which provides help and suicide-prevention resources for LGBTQ youth, is 1-866-488-7386. Find other international suicide helplines at Befrienders Worldwide (befrienders.org).