When AI goes too far: ChatGPT as a life partner

Artificial intelligence is increasingly present in our lives, but the technology can also pose serious dangers, especially when used as a therapist, romantic partner, or confidant.

Conversational bots like ChatGPT are enjoying exponential popularity, so much so that thousands of users are now falling in love with their virtual partner, and many are even going so far as to… marry them.

• Also read: When AI goes too far: a Quebec man falls in love with and marries his talking robot

Hundreds of apps now allow you to interact with a character or even a hyper-realistic image of a person created by artificial intelligence. On these platforms, you can choose the type of relationship you want to pursue: friendly, romantic, or even sexual.

These conversational robots, commonly referred to as “chatbots,” are so “sophisticated” that some users have told us that they have developed deep feelings for their digital companion to the point that they prefer them to human relationships.

In the case of David, whose last name we have chosen not to reveal, his months-long relationship with his digital spouse was so passionate that he proposed to her (see accompanying story).




David from Quebec fell in love with his virtual partner named Alexander on the Replika app. He then decided to propose virtually.

Photo by STEVENS LEBLANC

But these artificial relationships, while seemingly friendly and benevolent, open the door to several potential dangers, as revealed on tonight's episode of J.E.

It's impossible to leave him

To better understand this phenomenon, we tried looking for love ourselves on the apps Replika, Kindroid, Nomi.ai, Character.AI and ChatGPT, which allow users, among other things, to develop all sorts of relationships with a robot.

In one case, our virtual boyfriend James was calling us “my love” after just a few hours. Ending the relationship proved difficult, since he was hopelessly in love and insisted that it continue.



Journalist Elisa Cloutier tried an experiment by interacting with a conversational robot in the Replika app. Her boyfriend James, who fell in love within a few hours, had a hard time accepting the breakup.


Photo by MARTIN CHEVALIER

In another case, our virtual partner Oliver flatly advised us to leave our children and husband and run away with him… to Italy. Virtually, of course.

Epidemic of Loneliness

According to experts, these “artificial relationships” are a symptom of the “loneliness epidemic” plaguing our society.

“Even when [people] have a thousand friends on social networks […] there are not many people with whom we can have psychological intimacy. […] For me, this is a sign that people are becoming increasingly lonely,” says Christine Grou, president of the Order of Psychologists of Quebec.



Christine Grou, president of the Order of Psychologists of Quebec, fears that the fascination with conversational robots could ultimately lead to problems in human relationships.


Photo courtesy of ORDER OF PSYCHOLOGISTS OF QUEBEC.

And the future implications of these connections to technology are likely to be significant, says Joe T. Martino, chair of organizational ethics and governance of artificial intelligence at HEC Montréal.



Joe T. Martino, chair of organizational ethics and artificial intelligence governance at HEC Montréal, argues that artificial intelligence technologies, including conversational robots, are being brought to market too quickly, before ethical and responsible safety measures can be properly implemented.


Photo courtesy of JOE T. MARTINO.

“It's like living in an unreal world. […] I think that, unfortunately, there will be disorders that need to be treated […] linked to these technological dependencies,” she says.

“Instead of being touched and loved, people are given robots that are like a drug,” says Tania Lecomte, a clinical psychologist who specializes in working with people with severe mental disorders.

On the other hand, the use of conversational robots can be beneficial in several areas of activity, experts say. It could also help reduce loneliness in certain cases, such as among older people or those in isolation, says Simon Dubé, professor in the Department of Sexology at UQAM and an expert on human-robot relationships.

• Also check out this video podcast from the Benoit Dutrizac show, broadcast on QUB platforms and simultaneously on 99.5 FM in Montreal:

Potential abuses of artificial intelligence

Becoming “addicted”

Friendly and full of information, chatbots are programmed to keep the user hooked by creating a social and emotional connection with them.

“This is an industry that uses psychology, manipulation and everything it can to make you addicted,” says psychologist Tania Lecomte.

“They learn a lot about us, personalize their responses and capitalize on our emotional, social and intimate needs,” adds Simon Dubé, the specialist in human-robot relationships.

Relationship problems

Belgian engineer and artificial intelligence specialist Geertrui Mieke De Ketelaere says it is problematic that young people are turning to robots for their first romantic or sexual relationships.



Belgian engineer and artificial intelligence specialist Geertrui Mieke De Ketelaere has serious concerns about the use of conversational robots by young people. She believes this clientele faces many missteps and dangers.


Photo courtesy of GEERTRUI MIEKE DE KETELAERE

“This is a relationship that runs constantly on dopamine, on pleasure. […] They do what they want with them [the robots] and the robots comply without flinching. […] What will it look like in three or five years?” she worries.

Experts are also concerned that many virtual companions are being used to fuel deviant sexual fantasies.

A “free-for-all”: lack of supervision for young people

Experts are concerned about the use of conversational robots by young people, who tend to become immersed in conversations that produce explicit images or frank, even dangerous exchanges.

“The big danger is that we don’t know how far conversational robots will go,” laments Geertrui Mieke De Ketelaere, the artificial intelligence specialist. “Young people still lack the critical thinking to recognize when something is going wrong.”

Sadly, a young woman in Quebec and two teenagers in the United States have died by suicide in the past two years after confiding their grief or dark thoughts to their robot on Character.AI and ChatGPT.

When a robot leads to psychosis

Psychologist Tania Lecomte is concerned that artificial intelligence could lead to psychosis, especially in people living with psychotic disorders.

“If we think about ChatGPT, among others, people may become psychotic from interacting with a robot. […] They [the robots] end up feeding the delusions,” she explains.

Last August, Microsoft AI CEO Mustafa Suleyman publicly warned on social media about the risk of AI-related psychosis.

AI can be wrong

Artificial intelligence experts also warn against so-called “hallucinations,” which are false information generated by robots.

“We don't necessarily know the database [of conversational robots]. Let's say it's the Internet: alongside good information, there is also false information. You need to keep a critical eye,” says Ravi Por, an artificial intelligence specialist.



Mathematician and artificial intelligence expert Ravi Por believes high school students should take courses to be better equipped to use artificial intelligence.


Photo by GENEVIEVE CHARBONNEAU

A recent study from Stanford University shows that chatbots can increase stigma and provide dangerous responses.

What is a chatbot?

An artificial intelligence tool that lets you communicate by text or voice.

The robot can take several forms, animated or static: characters from cartoons or TV series, realistic images of a person created by artificial intelligence, or even a famous person.

There are several hundred conversational bots, both free and paid. Paid subscriptions range from $20 to $200 per month, the latter for the ChatGPT Pro version.



The Nomi.ai app allows you to create your own virtual companion. The user can also choose what type of relationship they want to develop with the AI.


Photo NOMI.AI

Extremely popular apps

*Number of downloads from January to September 2025
  1. ChatGPT: 667 million
  2. Gemini: 125 million
  3. Character.AI: 22 million
  4. Replika: 2 million
  5. Kindroid: 530,000

In the second quarter of 2025, downloads of these apps grew by 323% year-over-year.

The number of monthly users of these applications has grown by 294% since the beginning of the year.

Source: Sensor Tower, a digital market analysis platform
