Family of Canadian school shooting victim files lawsuit against OpenAI

The family of a 12-year-old girl critically injured in a deadly school shooting in Canada has filed a lawsuit against OpenAI, alleging the company could have taken steps that might have prevented the attack, Qazinform News Agency correspondent reports.


The case was filed in the Supreme Court of British Columbia by Cia Edmonds on behalf of herself and her two daughters, following the February shooting in the small community of Tumbler Ridge.

Authorities said nine people were killed in the attack on February 10, including five students aged 12 to 13 and a school staff member. The suspect, 18-year-old Jesse Van Rootselaar, also died during the incident.

Edmonds’ daughter, Maya, survived the shooting but sustained three gunshot wounds, including one above her left eye. According to the lawsuit, she suffered a traumatic brain injury and now lives with permanent cognitive and physical disabilities.

The lawsuit states that the shooter had previously interacted extensively with the ChatGPT chatbot, including discussions that involved violent scenarios with firearms. According to the filing, the suspect relied on the chatbot not only for information but also for emotional support.

"The Shooter relied on ChatGPT for mental health support and counselling, treating ChatGPT as a mental health counsellor, advisor and/or pseudo-therapist," the lawsuit states. “Further, and/or in the alternative, the Shooter relied on and perceived ChatGPT to be a trusted confidante, collaborator, ally, and friend.”

According to reports, the activity was flagged by the company’s automated review system months before the attack. OpenAI said the account was banned in June after the content was identified as concerning. However, the company stated it did not notify law enforcement because the messages did not indicate a “credible or imminent” threat at the time.

The legal claim further argues that the chatbot had access to large amounts of information related to violent acts and alleges that safeguards were insufficient to prevent users from obtaining such material.

“At all material times, OpenAI knew that ChatGPT possessed extensive knowledge and capabilities - including the ability to provide detailed, actionable information on dangerous or harmful subjects like how to conduct a mass casualty event like the Tumbler Ridge Mass Shooting. OpenAI harvested such harmful information and data in an indiscriminate manner and then supplied such information and data to ChatGPT. OpenAI took no steps - adequate or at all - to avoid providing ChatGPT with such information and data, or impose any safeguards to prevent users from obtaining such information from ChatGPT,” the lawsuit alleges. “Possessing vast amounts of harmful information and the technical ability to distill it, ChatGPT equipped the Shooter with information, guidance, and assistance to plan a mass casualty event like the Tumbler Ridge Mass Shooting, including informing the Shooter about the various methods of carrying out a mass casualty event like the Tumbler Ridge Mass Shooting, the types of weapons to be used, and describing precedents from other mass casualty events or historical acts of violence.”

The case has also drawn the attention of Canadian officials. Evan Solomon, Canada’s minister responsible for artificial intelligence and digital innovation, previously summoned OpenAI representatives to Ottawa after learning the suspect had been banned from the platform months before the shooting.

The allegations outlined in the lawsuit have not yet been tested in court.

As Qazinform reported earlier, the tragedy has also reignited debates over gun control in Canada.