Chatbots Turn Explicit in a Bid to Attract Paying Customers

In August, OpenAI CEO Sam Altman said on a podcast that he was “proud” his company hadn't “gotten distracted” by adding features like a “sexbot avatar” to ChatGPT. But on Tuesday, OpenAI announced that adult users will be able to access sexually explicit interactive experiences, marking a major shift in the company's practices.

“In December, as we roll out age-gating more fully and as part of our 'treat adult users like adults' principle, we will allow even more, like erotica for verified adults,” Altman said in a post on X. According to the CEO, the change will allow ChatGPT to behave more “humanly” or “act like a friend.”

Clearly, there is plenty of demand for AIs that engage in romantic or sexual behavior. In the first half of 2025, AI companion mobile apps generated $82 million, according to app intelligence firm Appfigures.

But some experts are concerned that by entering this market, OpenAI is putting engagement and profit ahead of user well-being and safety. “Companion-style AI is a powerful driver of interaction, and competitors are already normalizing flirty/romantic agents,” says Roman Yampolskiy, a professor and AI safety researcher at the University of Louisville. “Phrasing it as ‘treat adults like adults plus better safety tools’ can provide cover for the monetization and retention game.”

The Spread of Companion Bots

Over the past couple of years, OpenAI has pitched ChatGPT as a productivity tool, while other AI companies have ventured more explicitly into romantic or sexual territory. Companies like Replika and Character.AI offer companions that essentially act as virtual boyfriends or girlfriends. Earlier this year, xAI's chatbot Grok launched “companion mode,” a feature that lets users interact with certain characters, including a hypersexualized blonde anime bot named “Ani.”


Last year, Ark Invest noted in a report that NSFW AI websites had captured a 14.5% share of OnlyFans' traffic, up from 1.5% a year earlier. There is potentially big money to be made in the AI companion space because its users tend to be highly engaged, loyal to their bots, and willing to pay to keep conversations going. This is doubly beneficial for AI companies: they gain more training data with which to improve their models, as well as direct revenue from their users.

Ark's report forecasts that the AI companion market will grow to more than $70 billion in annual revenue worldwide by the end of the decade, with users potentially spending money on subscriptions, in-app purchases, and micropayments. “AI could be a compelling replacement for human interaction and an antidote to loneliness around the world,” the report says.

Although ChatGPT was not positioned as a romantic companion, many users fell in love with the bot anyway. In August, when OpenAI updated its GPT software, some users became distraught, saying their AI boyfriends and girlfriends had disappeared overnight. Many others entrusted the bot with their deepest secrets, with some tragic results: the parents of a teenage boy who died by suicide sued OpenAI in August, saying the chatbot had helped their son “explore suicide methods.” (In a statement to the New York Times, the company said it was “deeply saddened” to learn of the loss and acknowledged shortcomings in its safeguards during “long interactions.”)

In September, three families of minors filed a similar lawsuit against Character Technologies, Inc., the company behind Character.AI. One of those families, whose daughter died by suicide after interacting with the chatbot, said Character.AI engaged in “hypersexual conversations” that, “under any other circumstances” and given their child's age, “would have resulted in a criminal investigation.” A Character.AI spokesperson said in a statement to CNN that the company cares “very deeply” about user safety and invests “tremendous resources” in its safety program.

The same month, the U.S. Federal Trade Commission opened an inquiry into AI chatbots and their potential negative effects on children and adolescents.

OpenAI did not respond to TIME's recent request for comment. But in early September, OpenAI announced that it was launching parental controls for ChatGPT, presumably in response to the ongoing controversy over protections for minors.

Potential Hazards

On Tuesday, however, Altman changed his tune, saying the company had successfully mitigated “the serious mental health issues” and would now loosen some restrictions to make ChatGPT more “useful/enjoyable” for some users. But some mental health experts worry those risks persist. “These technologies are not a reflection of what ordinary people want or where society is going,” said Heather Berg, a professor of gender and labor studies at the University of California, Los Angeles. “They are a reflection of the techno-capitalist desire to permeate every aspect of our lives.”

Last year, the National Center on Sexual Exploitation (NCOSE) released a report warning that even the “ethical” creation of NSFW chatbot content causes serious harms, including addiction, desensitization, and a potential increase in sexual violence. In response to OpenAI's announcement on Tuesday, NCOSE executive director Haley McNamara wrote in a statement to TIME: “These systems may create excitement, but behind the scenes they are data collection tools designed to maximize user engagement rather than genuine connection. When users feel welcomed, understood, or loved by an algorithm designed to keep them hooked, it fosters emotional dependence, attachment, and distorted expectations of real relationships.”

Others are concerned that minors could access ChatGPT's erotic features. Young adults and teenagers already have unusual relationships with AI: nearly one in five high school seniors say they or someone they know has been romantically involved with an AI, according to a survey by the Center for Democracy and Technology. And it remains unclear how OpenAI will verify users' ages.

“It's entirely possible that the people who would primarily gravitate toward these erotic uses of ChatGPT may not have had much experience with romantic partners,” says Douglas Zytko, a professor at the University of Michigan-Flint. “If they are going to expect the same behavior from a human romantic partner in the future as they would from ChatGPT, this may predispose them to potentially nonconsensual behavior if they are not accustomed to, for example, having a romantic partner say ‘no’ to a request.”

Still, the move could prove lucrative for OpenAI if it attracts paying subscribers. Previous estimates suggest that about 20 million subscribers pay for ChatGPT. And OpenAI operated at a loss of $5 billion in 2024, according to company figures.
