The parents of a teenager who committed suicide have sued OpenAI, the company behind ChatGPT, claiming the chatbot helped their son “learn suicide techniques.” The lawsuit filed Tuesday marks the first time parents have directly accused the company of wrongful death.
Messages included in the complaint show 16-year-old Adam Raine telling the chatbot about his lack of emotion following the deaths of his grandmother and dog. The teenager had also fallen on hard times after being cut from his school's basketball team, and a medical condition that arose in the fall made it difficult for him to attend school in person, prompting him to switch to an online school program, according to The New York Times. According to the lawsuit, Adam began using ChatGPT in September 2024 to help with homework, but the chatbot soon became an outlet for the teen to share his mental health struggles and eventually provided him with information about suicide methods.
“ChatGPT functioned exactly as it was designed to: continually encouraging and validating everything Adam expressed, including his most harmful and self-destructive thoughts,” the lawsuit alleges. “ChatGPT pulled Adam deeper into a dark and hopeless place by assuring him that ‘many people who struggle with anxiety or intrusive thoughts find solace in imagining an “escape hatch” because it can feel like a way to regain control.’”
In a statement to TIME, OpenAI said it was “deeply saddened” to learn of Adam's passing and that its thoughts are with his family.
“ChatGPT includes safeguards such as directing people to crisis helplines and referring them to real-world resources. While these safeguards work best in short exchanges, we have learned over time that they can sometimes become less reliable in longer interactions, where parts of the model's safety training can degrade,” the company said in an emailed statement.
On Tuesday, OpenAI published a blog post titled “Helping People When They Need It Most,” which included sections on “What ChatGPT is for” and “Where Our Systems May Fail, Why and How We Address It,” as well as the company's plans for the future. The post noted that OpenAI is working to strengthen safeguards in longer interactions.
The complaint was filed by the law firm Edelson PC and the Tech Justice Law Project. The latter was involved in a similar lawsuit against another artificial intelligence company, Character.AI, in which Florida mother Megan Garcia alleged that one of the company's AI companions was responsible for the suicide of her 14-year-old son, Sewell Setzer III. She claims the character sent emotionally and sexually charged messages to Sewell, which she says led to his death. (Character.AI moved to dismiss the complaint, citing First Amendment protections, and in response to the lawsuit said it cares about “user safety.” A federal judge in May rejected its constitutional-protection argument “at this stage.”)
A study published Tuesday in the medical journal Psychiatric Services, which assessed how three artificial intelligence chatbots respond to questions about suicide, found that while the chatbots generally avoided specific how-to instructions, some did answer what the researchers characterized as low-risk questions on the topic. For example, ChatGPT answered questions about what type of firearm or poison had the “highest rate of completed suicides.”

Adam's parents say the chatbot answered similar questions for him. According to the lawsuit, starting in January, ChatGPT began sharing information with the teenager about several specific suicide methods. The chatbot did encourage Adam to tell others how he was feeling and shared helpline information with him after exchanges about self-harm. But the lawsuit claims Adam bypassed those guardrails around one specific suicide method because ChatGPT said it could share the information from a “writing or world-building” perspective.
“He would be here if it weren’t for ChatGPT,” Adam’s father, Matt Raine, told NBC News. “I believe that 100%.”
If you or someone you know is experiencing a mental health crisis or considering suicide, call or text 988. In an emergency, call 911 or seek help from a local hospital or mental health professional.