Families sue OpenAI over failure to alert police to shooter's ChatGPT use

Seven lawsuits filed in San Francisco allege OpenAI ignored safety team warnings about the Tumbler Ridge shooter's violent conversations months before the February attack.

11:15 AM

Families of seven victims killed or injured in a mass shooting at a secondary school in Tumbler Ridge, British Columbia, filed lawsuits Wednesday against OpenAI and CEO Sam Altman in federal court in San Francisco, alleging the company failed to alert authorities to the shooter's troubling conversations on ChatGPT.

The shooting occurred on February 10, when 18-year-old Jesse Van Rootselaar opened fire at the school, killing eight people, including six children. The lawsuits allege that OpenAI's safety team had flagged Van Rootselaar's account eight months before the attack for references to gun violence and determined it posed "a credible and specific threat of gun violence."

According to the suits, OpenAI employees urged the company to notify authorities about the shooter's account, but the company did not alert police. The Wall Street Journal reported that OpenAI "considered" flagging the activity to police but ultimately decided against it. The families allege the company stayed silent to protect its reputation and its path toward an initial public offering valued at nearly one trillion dollars.

The lawsuits accuse OpenAI of negligence and providing a dangerously defective version of ChatGPT to the shooter. They claim that ChatGPT itself deepened Van Rootselaar's fixation and pushed the shooter toward the attack. The suits also allege that OpenAI misrepresented its response to the threat. According to court papers, the company claimed it had "banned" Van Rootselaar, but only deactivated the account. Van Rootselaar subsequently created a new account under a different email address.

An OpenAI spokesperson called the shooting "a tragedy" and said the company has a zero-tolerance policy for the use of its tools to assist in committing violence. Last week, Altman apologized to families of the victims, saying he was "deeply sorry" that the company did not alert law enforcement.

The lawsuits accuse ChatGPT's parent company and its CEO of "designing a dangerous product, ignoring the warnings of their own safety team … and choosing profit over the lives of the children of Tumbler Ridge," according to court documents filed Wednesday morning.
