Suzanne Bearne
Technology reporter
Earlier this year, Rachel wanted to clear the air with a man she had been dating before seeing him again in a wider friendship group.
“I'd used ChatGPT for job hunting, but I'd heard how someone else had used it [for dating advice],” says Rachel, who doesn't want her real name used, and lives in Sheffield.
“I was feeling quite upset and wanted some guidance, and I didn't want to involve my friends.”
Before the phone call, she turned to ChatGPT for help. “I asked it how to approach the conversation without getting defensive.”
Its answer?
“ChatGPT does this all the time, but it was something like, ‘Wow, that's such a self-aware question, it's emotionally mature of you to work through it. Here are some tips.’ It was like having a cheerleader on my side, as if I was right and he was wrong.”
Overall, she says, it was “helpful”, though she describes the language as “very therapy-speak, using words like ‘boundaries’”.
“All I really took from it was a reminder that it was OK to do things on my own terms, but I didn't take it literally.”
Rachel is not alone in turning to AI for relationship advice.
According to a study by online dating company Match, almost half of Gen Z Americans (those born between 1997 and 2012) said they had used LLMs such as ChatGPT for dating advice, more than any other generation.
People are turning to AI for help with writing break-up messages, analysing conversations with people they are dating, and resolving conflicts in their relationships.
Dr Lalitaa Suglani, a psychologist and relationship expert, says AI can be a useful tool, particularly for people who feel overwhelmed or unsure of themselves when it comes to communicating in a relationship.
It can help them draft a text, process a confusing message, or get a second opinion, offering a moment of pause rather than a reactive reply, she says.
“In many ways it can function like a journalling prompt or a reflective space, which can be supportive when it's used as a tool, and not a replacement for connection,” says Dr Suglani.
However, she notes several problems.
“LLMs are trained to be helpful and agreeable, and to mirror back what you share, so they can subtly validate or echo dysfunctional patterns, especially if the prompt is biased. The problem with this is that it can reinforce distorted narratives or avoidance tendencies.”
For example, she says, using AI to write a break-up text can be a way of avoiding the discomfort of the situation. It can feed into avoidant behaviour, because the person never sits with what they actually feel.
Relying on AI can also get in the way of a person's own growth.
“If someone turns to an LLM every time they're unsure how to respond or feel emotionally exposed, they can begin to outsource their intuition, their emotional language and their sense of self in the relationship,” says Dr Suglani.
She also notes that AI-generated messages can feel emotionally sterile and make communication read like a script, which can be troubling.
Despite the concerns, services are springing up to serve the relationship advice market.
Mei is a free AI-powered service. Built using OpenAI's technology, it responds to relationship dilemmas with conversational answers.
“The idea is to give people somewhere to turn instantly for help navigating their relationships, because not everyone can talk to friends or family for fear of judgement,” says its New York-based founder Es Lee.
Mr Lee says that more than half of the questions put to the AI tool concern a subject that many may not want to raise with friends or a therapist.
“People are only turning to AI because existing services aren't enough,” he says.
Another common use is asking how to rephrase a message, or how to handle an issue in a relationship. “It's as if people need AI to validate it [the problem].”
Providing relationship advice raises questions of safety. A human counsellor is trained to recognise when to intervene and protect a client from a potentially harmful situation.
Will a relationship app provide the same guardrails?
Mr Lee acknowledges the safety concerns. “I think the stakes are higher with AI, because it can connect with us on a personal level like no other technology.”
But he says Mei has “guardrails” built into its AI.
“We welcome professionals and organisations to partner with us and play an active part in shaping our AI products,” he says.
OpenAI, the creator of ChatGPT, says its latest model has shown improvements in areas such as avoiding unhealthy levels of emotional reliance and sycophancy.
In a statement, the company said:
“People sometimes turn to ChatGPT in sensitive moments, so we want to make sure it responds appropriately, guided by expert input. This includes directing people to professional help when appropriate, strengthening our safeguards in how our models respond to sensitive requests, and nudging users to take breaks during long sessions.”
Another area of concern is privacy. Apps like these can potentially collect highly sensitive data, which could be damaging if it were exposed in a hack.
Mr Lee says: “At every fork in the road over how we handle user privacy, we choose the option that preserves privacy and collects only what we need to provide the best service.”
As part of that policy, he says, Mei doesn't ask for any information that would identify a person, other than an email address.
Mr Lee also says conversations are stored temporarily for quality assurance, but discarded after 30 days. “Right now, they aren't stored forever in any kind of database.”
Some people use AI in combination with a human therapist.
When Corinne (not her real name) wanted to end a relationship late last year, she began turning to ChatGPT for advice on how to handle it.
Corinne, who lives in London, says she was inspired to turn to AI after hearing a housemate speak positively about using it for dating, including for working out how to break up with someone.
She says she would ask it to answer her questions in the style of popular relationship expert Jillian Turecki or holistic psychologist Dr Nicole LePera, both of whom have big followings on social media.
When she started dating again at the beginning of this year, she returned to it, once more asking for advice in the style of her favourite relationship experts.
“Around January I went on a date with a guy, and I didn't find him physically attractive, but we got on really well, so I asked whether I should go on another date. I knew from reading their books that they'd say yes, but it was nice to have the advice tailored to my scenario.”
Corinne, who has a therapist, says her sessions with them delve more into her childhood than the dating and relationship questions she raises with ChatGPT.
She says she treats the AI's advice with “a bit of distance”.
“I can imagine people ending relationships, or perhaps having conversations they shouldn't yet be having [with their partner], since ChatGPT just repeats back what it thinks you want to hear.
“It's good in stressful moments of life, and when a friend isn't around. It calms me down.”