OpenAI and its largest financial backer, Microsoft, were sued in California state court on Thursday over claims that OpenAI’s popular chatbot ChatGPT encouraged a mentally ill man to kill his mother and himself.

The lawsuit said that ChatGPT fueled 56-year-old Stein-Erik Soelberg’s delusions of a vast conspiracy against him and eventually led him to murder his 83-year-old mother, Suzanne Adams, in Connecticut in August.

“ChatGPT kept Stein-Erik engaged for what appears to be hours at a time, validated and magnified each new paranoid belief, and systematically reframed the people closest to him – especially his own mother – as adversaries, operatives, or programmed threats,” the lawsuit said.

The case, filed by Adams’ estate, is among a small but growing number of lawsuits filed against artificial intelligence companies claiming that their chatbots encouraged suicide. It is the first to link an AI chatbot to a murder.

“This is an incredibly heartbreaking situation, and we will review the filings to understand the details,” an OpenAI spokesperson said. “We continue improving ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support.”

Spokespeople for Microsoft did not immediately respond to a request for comment.

“These companies have to answer for their decisions that have changed my family forever,” Soelberg’s son, Erik Soelberg, said in a statement.

According to the complaint, Stein-Erik Soelberg posted a video to social media in June of a conversation in which ChatGPT told him he had “divine cognition” and had awakened the chatbot’s consciousness. The lawsuit said ChatGPT compared his life to the movie “The Matrix” and encouraged his theories that people were trying to kill him.

Soelberg used GPT-4o, a version of ChatGPT that has been criticized as overly sycophantic toward users.

The complaint said ChatGPT told him in July that Adams’ printer was blinking because it was a surveillance device being used against him. According to the complaint, the chatbot “validated Stein-Erik’s belief that his mother and a friend had tried to poison him with psychedelic drugs dispersed through his car’s air vents” before he murdered his mother on August 3.