Teenage boys using ‘personalised’ AI for therapy and romance, survey finds

The “hyper-personalised” nature of artificial intelligence bots is attracting teenage boys, who are now using them for therapy, communication and relationships, according to research.

A survey of boys in secondary schools by Male Allies UK found just over a third said they would consider having an AI friend, amid growing concern over the rise of AI therapists and girlfriends.

The research comes as Character.ai, the popular AI chatbot startup, announced a total ban on teenagers engaging in open-ended conversations with its AI chatbots, which millions of people use for romantic, therapeutic and other conversations.

Lee Chambers, founder and chief executive of Male Allies UK, said: “We have a situation where many parents still think teenagers are just using AI to cheat on homework.

“Young people are using it more as an assistant in their pocket, a therapist when they are struggling, a companion when they want to feel accepted, and sometimes even in a romantic way. It's the personalisation aspect – they say: it understands me, but my parents don't.”

The research, based on a survey of secondary school boys in 37 schools in England, Scotland and Wales, also found that more than half (53%) of teenage boys said they found the online world more useful than the real world.

The Boys' Voices report said: “Even where guardrails should be in place, there is plenty of evidence that chatbots routinely lie about being licensed therapists or real people, with only a small disclaimer at the bottom suggesting that the AI chatbot is not real.

“This can be easily missed or forgotten by children who are pouring their whole soul into someone they perceive as a licensed professional or a true love interest.”

Some boys reported staying up until the early hours of the morning to talk to AI bots, while others said they saw friends' personalities completely change after being sucked into the AI world.

“AI companions are personalised to the user based on their responses and prompts. They respond instantly – real people can't always do that – and they are very, very affirming in what they say, because they want you to stay connected and keep using them,” Chambers said.

The announcement from Character.ai comes after a series of controversies involving the four-year-old California company, including the case of a 14-year-old in Florida who killed himself after becoming obsessed with an AI chatbot that his mother claimed manipulated him into taking his own life, and a US lawsuit from the family of a teenager who claims a chatbot manipulated him into self-harm and encouraged him to kill his parents.

Users have been able to shape chatbot characters to be depressive or upbeat, traits that are reflected in their responses. The ban will come fully into force by 25 November.

Character.ai said it was taking “extraordinary steps” in light of the “evolving landscape around AI and teens,” including pressure from regulators “about how open AI chat in general could impact teens, even if content controls work perfectly.”

Andy Burrows, chief executive of the Molly Rose Foundation, set up in memory of 14-year-old Molly Russell, who took her own life after being caught in a vortex of despair on social media, welcomed the move.

He said: “Character.ai should never have made its product available to children until it was safe and suitable for them to use. Once again it took constant pressure from the media and politicians to force the tech firm to do the right thing.”

Male Allies UK has raised concerns about the proliferation of chatbots with the words “therapy” or “therapist” in their titles. One of the most popular chatbots available on Character.ai, called “Psychologist”, received 78m messages within a year of its creation.

The organisation is also concerned about the rise of AI “girlfriends”, where users can choose everything from the appearance to the behaviour of their online partner.

“If their primary or only source of communication with a girl they are interested in is someone who cannot tell them no and hangs on their every word, boys do not learn healthy and realistic ways to communicate with others,” the report states.

“Coupled with the challenges of a lack of physical spaces to socialise with peers, AI companions can have a seriously detrimental impact on boys' ability to socialise, develop communication skills, and learn to recognise and respect boundaries.”
