The family of a student who was critically wounded during a mass shooting in British Columbia last month has sued the artificial intelligence company OpenAI, accusing it of failing to warn the police of disturbing information about the shooter’s ChatGPT account.
Lawyers for the family of the 12-year-old student, Maya Gebala, said she had been trying to lock a library door to protect her classmates when she was shot three times at close range, including once in the head. The girl remains hospitalized in Vancouver and has undergone multiple brain surgeries.
On Feb. 10, Jesse Van Rootselaar, 18, took two firearms from her home in Tumbler Ridge, British Columbia, and killed her mother and 11-year-old brother, the authorities said. She then traveled to the Tumbler Ridge Secondary School and killed five students and one educator and shot two other students, including Ms. Gebala.
Ms. Van Rootselaar died of a self-inflicted gunshot wound, the authorities said.
Eight months before the attack, OpenAI had suspended a ChatGPT account associated with Ms. Van Rootselaar for violating its user agreement, the company said. She had documented her fascination with violence and weapons across several social media accounts, according to a review by The New York Times.
The lawsuit, filed on Monday by Ms. Gebala’s mother, Cia Edmonds, claims that OpenAI was “aware of the shooter’s violent intentions” and use of its A.I. chatbot to plan “scenarios involving gun violence, including a mass casualty event.”
“The purpose of this lawsuit is to learn the whole truth about how and why the Tumbler Ridge mass shooting happened,” the law firm representing Ms. Gebala’s family, Rice Parsons Leoni and Elliott LLP, said in a statement.
OpenAI said it had suspended Ms. Van Rootselaar’s account after an automated system flagged messages from her ChatGPT conversations, which were then manually reviewed by its team. The company did not provide more details about the contents of the messages.
While the company considered alerting officials in Canada, OpenAI said it had ultimately decided that the messages did not meet its reporting threshold, which considers whether a user appears to be imminently planning harm.
The company said it had considered the privacy of users when making referrals to law enforcement and did not want to distress them by involving the police.
Canada summoned top safety officials from OpenAI to Ottawa for a meeting in late February, after which the company revealed in a public letter that Ms. Van Rootselaar had created a second ChatGPT account after her first one was banned.
In the letter, OpenAI said it had “taken steps to strengthen our safeguards,” including improving systems to detect banned users who try to establish new accounts and adding assessments by mental health and behavioral experts for complex cases.
David Eby, the premier of British Columbia, said that the chief executive of OpenAI, Sam Altman, had agreed to apologize to the community of Tumbler Ridge. OpenAI released a statement after the two men spoke on March 5.
Mr. Altman “will work with them to find the best way to convey his apology and support to the Tumbler Ridge community,” Jamie Radice, a spokeswoman for OpenAI, said in a statement.
After the shooting, Ms. Gebala and the other injured student were airlifted to the children’s hospital in Vancouver, about 750 miles south of Tumbler Ridge.
The other student has since returned home, but Ms. Gebala’s long-term prospects of recovery are uncertain.
The chief coroner of British Columbia has ordered a public inquest into the circumstances that led to the attack.
Vjosa Isai is a reporter for The Times based in Toronto, where she covers news from across Canada.