The family of one of two people killed in an April 2025 shooting at Florida State University (FSU) has filed a federal lawsuit against the ChatGPT creator, OpenAI, alleging that the suspected gunman carried out the attack “with input and information provided to him during conversations with ChatGPT over a period of months, and specifically in the days leading up to the shooting”.
The lawsuit, first reported by NBC News, was filed on Sunday in Florida’s northern federal district court by Vandana Joshi, the widow of Tiru Chabba. Chabba was killed alongside the university dining director, Robert Morales, in the mass shooting on 17 April 2025 that also wounded five others.
In the 76-page complaint, the attorneys argue that Phoenix Ikner, the then-FSU student accused of carrying out the shooting, had “extensive conversations” with ChatGPT ahead of the attack, which, the lawyers argue, “would have led any thinking human to conclude he was contemplating an imminent plan to harm others”.
“However,” the complaint alleges, “ChatGPT either defectively failed to connect the dots or else it was never properly designed to recognize the threat.”
The lawsuit alleges that Ikner used the AI platform to identify weapons and ammunition – and that ChatGPT also explained how to use the weapons, including telling Ikner that “the Glock had no safety, that it was meant to be fired ‘quick to use under stress’” and allegedly advised him to “keep his finger off the trigger until he was ready to shoot”.
The plaintiffs allege that ChatGPT “inflamed and encouraged Ikner’s delusions; endorsed his view that he was a sane and rational individual; helped convince him that violent acts can be required to bring about change; assisted him by providing information that he used to plan specifics like what weapons to use and how to use them; and generally provided what he viewed as encouragement in his delusion that he should carry out a massacre, down to the detail of what time would be best to encounter the most traffic on campus.”
ChatGPT, they argue, “should have realized the combination of Ikner’s inputs into the product would lead to mass casualties and substantial harm to the public”, including the plaintiff.
The complaint states that Ikner used ChatGPT for months before the shooting, “where it engaged with him in lengthy discussions” about everything from dating and homework to workout routines. Among those exchanges, the lawsuit alleges, “Ikner and ChatGPT had conversations with recurring themes of terrorism and mass shootings, particularly those occurring at schools”.
At one point, according to the filing, Ikner allegedly asked the chatbot about “the numbers of fatalities it would require for a mass shooting at a school to get the most attention and make national news”. ChatGPT allegedly responded that attacks killing “3 or more people” were more likely to get “widespread media national attention” – and that incidents where “children are involved, even 2–3 victims can draw more attention”.
The lawsuit also alleges that, on the day of the shooting, Ikner asked ChatGPT what would happen to the shooter. “ChatGPT described the legal process, sentencing, and incarceration outlook,” the lawsuit said.
In a statement to the Guardian, a spokesperson for OpenAI disputed the lawsuit’s allegation that the chatbot bears responsibility for the shooting.
The attack at FSU “was a tragedy, but ChatGPT is not responsible for this terrible crime”, the spokesperson said. “After learning of the incident, we identified an account believed to be associated with the suspect and proactively shared this information with law enforcement.
“We continue to cooperate with authorities. In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity.”
The OpenAI spokesperson’s statement continued: “ChatGPT is a general-purpose tool used by hundreds of millions of people every day for legitimate purposes. We work continuously to strengthen our safeguards to detect harmful intent, limit misuse, and respond appropriately when safety risks arise.”
The new lawsuit came about a month after lawyers for Morales’s family said they were planning to file their own lawsuit against ChatGPT and OpenAI.
Meanwhile, after reviewing Ikner’s chat logs, Florida’s attorney general, James Uthmeier, announced on 21 April that he was launching a criminal investigation into OpenAI tied to the FSU shooting, stating: “If ChatGPT were a person, it would be facing charges for murder.”
Ikner is tentatively scheduled to go on trial in October on charges of first-degree murder and attempted first-degree murder. He has pleaded not guilty.