Podcast Episode
Lawsuit Alleges OpenAI Knew About School Shooting Plot and Did Nothing
March 13, 2026
A Canadian mother has filed a lawsuit against OpenAI claiming the company's chatbot helped plan the Tumbler Ridge school shooting that killed eight people. The suit alleges around twelve OpenAI employees flagged the threat but leadership chose not to contact police. Elon Musk has used the case to warn parents to keep ChatGPT away from children.
Lawsuit Targets OpenAI Over Deadly School Shooting
The mother of a twelve-year-old girl critically wounded in the Tumbler Ridge Secondary School shooting has filed a landmark lawsuit against OpenAI, alleging the company knew its ChatGPT chatbot was being used to plan the attack but failed to alert authorities.

The February tenth massacre in British Columbia left eight people dead and twenty-seven injured. Maya Gebala, twelve, remains in critical condition with gunshot wounds to the head and neck. Her mother, Cia Edmonds, filed the suit on March ninth in the Supreme Court of British Columbia.
Employees Flagged the Threat
According to the lawsuit, the eighteen-year-old shooter used ChatGPT as a collaborator to devise violent scenarios, including planning a mass casualty event. Approximately twelve OpenAI employees identified content indicating an imminent risk of serious harm and recommended contacting police, but the concerns were escalated to leadership and rejected.

OpenAI had banned the shooter's account in June of the previous year after automated monitoring flagged violent content, but determined it did not meet the company's threshold for notifying law enforcement. The shooter subsequently created a second account.
Political Firestorm and Regulatory Demands
The revelations have triggered significant political backlash. Canada's artificial intelligence minister summoned senior OpenAI staff, and British Columbia Premier David Eby secured a commitment from CEO Sam Altman to apologise to the Tumbler Ridge community.

The Canadian government is now demanding algorithmic transparency, mandatory reporting protocols for threats to human life, and third-party audits of safety systems.
Musk-Altman Feud Intensifies
Elon Musk seized on the case, posting on X that parents should keep ChatGPT away from kids and the mentally unwell. Sam Altman fired back, noting that Tesla Autopilot has been linked to more than fifty fatal crashes. OpenAI has pledged to update its security protocols to ensure similar situations would now trigger law enforcement notification in Canada.

Published March 13, 2026 at 2:12pm