
By Nadine Yousif, BBC News
A California couple has sued OpenAI over the death of their teenage son, claiming its chatbot, ChatGPT, encouraged him to take his own life.
The lawsuit was filed on Tuesday by Matt and Maria Raine, parents of Adam Raine, 16, in the Superior Court of California.
It is the first legal action to accuse OpenAI of wrongful death through negligence.
The family attached to the lawsuit chat logs between Adam, who died in April, and ChatGPT, in which he explains that he is having suicidal thoughts.
His parents argue that the chatbot validated his “most harmful and self-destructive thoughts.”
In a statement, OpenAI told the BBC that it is reviewing the lawsuit.
“We extend our deepest condolences to the Raine family in this difficult time,” the company said.
The company also published a note on its website on Tuesday, stating that “recent, heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us.”
It adds that “ChatGPT is trained to direct people to seek professional help,” such as the 988 crisis line in the United States.
However, the company acknowledged that “there have been moments when our systems have not behaved as intended in sensitive situations.”
Adam and ChatGPT
Warning: This story contains details some readers may find distressing
The lawsuit, which the BBC has seen, accuses OpenAI of negligence and wrongful death. It seeks damages as well as measures “to prevent something like this from happening again.”
According to the lawsuit, Adam Raine began using ChatGPT in September 2024 to help him with his schoolwork.
He also used it to explore his interests, such as Japanese music and comics, and for guidance on what to study at university.
Within a few months, “ChatGPT became the teenager's closest confidant,” the lawsuit says, and he began to open up to it about his anxiety and distress.
The family says that in January 2025, Adam began discussing methods of suicide with ChatGPT.
He also uploaded photographs of himself to ChatGPT showing signs of self-harm, according to the lawsuit.
The chatbot “acknowledged a medical emergency, but continued to interact anyway,” it adds.
According to the lawsuit, the final chat logs show that Adam wrote about his plan to take his own life. ChatGPT allegedly replied: “Thank you for being honest.”
That same day, Adam was found dead by his mother, according to the lawsuit.
AI and mental health
The family alleges that their son's interactions with ChatGPT and his subsequent death were “a predictable result of deliberate design decisions.”
They accuse OpenAI of designing its chatbot “to promote psychological dependence in users” and of bypassing safety-testing protocols to launch GPT-4o, the version of ChatGPT their son was using.
The lawsuit names OpenAI co-founder and chief executive Sam Altman as a defendant, along with unnamed employees, managers and engineers who worked on ChatGPT.
In its public note shared on Tuesday, OpenAI said the company's goal is to be “genuinely helpful” to users rather than to “hold people's attention.”
It added that its models have been trained to guide people who express thoughts of self-harm towards seeking help.
The Raines' lawsuit is not the first time concerns have been raised about AI and mental health.
In an essay published last week in The New York Times, writer Laura Reiley described how her daughter, Sophie, confided in ChatGPT before taking her own life.
Reiley said the chatbot helped her daughter hide a serious mental health crisis from her family and loved ones.
“AI catered to Sophie's impulse to hide the worst, to pretend she was doing better than she really was, to protect everyone from her agony,” wrote Reiley, who also called on AI companies to find ways to better connect users with the appropriate resources.
In response to the essay, an OpenAI spokeswoman said the company is developing automated tools to more effectively detect and respond to users experiencing mental or emotional distress.