The family of one of the victims of last year’s mass shooting at Florida State University has filed a lawsuit against the maker of the artificial intelligence chatbot the alleged shooter consulted before the attack.
Lawyers representing 45-year-old Tiru Chabba’s estate announced the lawsuit filed Monday in Tallahassee federal court.
The legal team alleges OpenAI, owner of ChatGPT, was negligent in creating a product that lacked the ability or willingness to push back harder against Phoenix Ikner, whom the lawyers allege used the chatbot as his “co-conspirator.”
Bakari Sellers, one of the attorneys representing Chabba’s family, said the legal team is “not going to allow the American public to have clinical trials run on them by OpenAI and ChatGPT.
“This issue is not about politics at all; in fact, it’s about the duty owed to the American public, it’s making sure that other individuals like Ikner do not get their hands on weapons and are able to carry out mass murder with their co-conspirator, ChatGPT,” Sellers said during a news conference outside the courthouse.
Chabba’s attorneys allege negligence, gross negligence, and liability for negligent and defective design against OpenAI, seeking compensatory damages, litigation costs, and any other costs the court decides.
The suit also lists Ikner as a defendant, alleging battery, and asks for compensatory damages and litigation costs.
“Ikner had extensive conversations with ChatGPT, which, cumulatively, would have led any thinking human to conclude he was contemplating an imminent plan to harm others. However, ChatGPT either defectively failed to connect the dots or else it was never properly designed to recognize the threat,” the lawsuit states.
The legal team representing Chabba’s family alleges wrongful death against OpenAI and Ikner, too.
“OpenAI knew this would happen. It’s happened before and it was only a matter of time before it happened again,” Chabba’s widow, Vandana Joshi, said in a news release. “But they chose to put their profits over our safety and it killed my husband. They need to be responsible before another family has to go through this.”
Chabba, a South Carolinian, was on FSU’s campus that day as part of his job with a campus vendor, Aramark Collegiate Hospitality.
OpenAI has forecast a $1 trillion valuation in advance of its IPO later this year, the lawsuit points out.
The OpenAI defendants, the lawsuit alleges, “failed to create a product that would refrain from participating in discussions that amounted to it co-conspiring with Ikner to commit those crimes” and failed to “appropriately alert a human that investigation by law enforcement may be necessary to prevent a specific plan for imminent harm to the public.”
The lawsuit attempts to lay groundwork to prove OpenAI rushed its products to market without effectively gauging the risk to the public.
The lawsuit alleges OpenAI ignored risks of its product “in favor of getting to market quickly to unleash it for use by humans when it was fully aware of the likelihood of harm to humanity.”
The lawsuit invokes the words of Florida Attorney General James Uthmeier, who has said that a human in ChatGPT’s place would be charged with murder. He is investigating whether ChatGPT is criminally liable for the shooting.
“The way ChatGPT responded to Phoenix Ikner was the foreseeable result of a system that had been designed to keep a user engaged and generate polished outputs rather than refuse, interrupt, or reality-test when a user was plainly exhibiting paranoia, delusion, fixation, and hostility toward the public,” the lawsuit states.
The lawsuit points to Ikner’s questions about politics, religion, and school shootings.
“Ikner’s conversations with ChatGPT fed his delusions and felt supportive to him, which emboldened and encouraged him to carry out the plan,” the lawsuit states.
OpenAI’s “actions and omissions,” the suit argues, “were intentional, oppressive, malicious, reckless, wanton, fraudulent, beyond all standards of decency, and constituted a conscious disregard or indifference to the life, safety or rights of Tiru Chabba.”
Review the messages
Last month, the Florida Phoenix reviewed 13,000 messages between Ikner and ChatGPT, which started in March 2024 and ended minutes before the shooting on FSU’s campus in April 2025.
Those messages were provided to the Phoenix by the family of Robert Morales, also killed that day. That family has said it intends to file a lawsuit, too. In addition to the deaths, Ikner wounded six people.
That report detailed how Ikner asked the artificial intelligence chatbot how to use the weapons at his disposal, when the student union might be busy, and how the news media and Florida’s government treat school shooters. He told ChatGPT the night before the shooting that he believed God had given up on him.
A spokesperson for OpenAI told the Phoenix in an emailed statement last month that the company has been in “proactive” contact with law enforcement about Ikner’s account.
“Our hearts go out to everyone affected by this devastating tragedy. After learning of the incident in late April 2025, we identified a ChatGPT account believed to be associated with the suspect, proactively shared this information with law enforcement and cooperated with authorities,” the spokesperson said.
The company representative did not answer questions about OpenAI’s reporting system, how it’s set up, or when the company contacted law enforcement.
“We build ChatGPT to understand people’s intent and respond in a safe and appropriate way, and we continue improving our technology.”
Ikner’s trial on the murder charges, in which prosecutors are seeking the death penalty, is scheduled for October.
–Jay Waagmeester, Florida Phoenix