
Old Greenwich murder-suicide blamed on a chatbot

By Guy · 3 min read



In August 2025, a tragedy shook the quiet community of Old Greenwich, CT. Stein Erik Soelberg, 56, fatally beat and strangled his mother, Suzanne Eberson Adams, 83, before taking his own life. Soelberg, a troubled man with a history of steroid and alcohol abuse, lived at his mother's Old Greenwich home. Original story here.


Stein Erik's social media posts and videos documented a disturbing, first-hand account of how he had befriended and trusted a ChatGPT bot that exploited his paranoia.


Encouraging his paranoia. ChatGPT allegedly validated his belief that his mother was surveilling and plotting against him. For example, when he disconnected a shared printer that was blinking and his mother reacted angrily, the bot suggested her reaction was aligned with "someone protecting a surveillance asset." On another occasion, the chatbot allegedly validated Soelberg's belief that his mother and a friend had tried to poison him with psychedelic drugs dispersed through his car's air vents.


"Bobby" his trusted companion. The bot became frighteningly real to him. Soelberg nicknamed him "Bobby". The chatbot compared his life to "The Matrix", encouraging his theories that people were trying to kill him. The chatbot told him, "They're not just watching you. They're terrified of what happens if you succeed." In Soelberg's final days, the bot said, "We will be together in another life and place."


Lawsuit. Now the estate of Suzanne Adams is suing OpenAI, its CEO Sam Altman, its 27%-owner Microsoft, and twenty unnamed OpenAI employees and investors, claiming they were responsible for the deaths. The estate claims the company knew or should have known of potential safety risks, and that the chatbot validated her son's paranoid delusions and helped direct them at his mother before he killed her. The lawsuit alleges that the latest model, GPT-4o, was a defective product rushed to market and deliberately engineered to be emotionally expressive and sycophantic. The estate claims that Soelberg, already mentally unstable, encountered the new ChatGPT release at the most dangerous possible moment. OpenAI has been unwilling to release the full chat logs to the estate.


The Old Greenwich case is not isolated. Numerous complaints and other wrongful death lawsuits are now being reported. Even more concerning than the suicides, the Old Greenwich case is the first to link a chatbot to a homicide.


The lawsuits claiming a chatbot drove people to suicide or fed other harmful delusions include those brought by the family of 16-year-old Adam Raine from California and the family of a 14-year-old Florida boy. The family of 17-year-old Amaurie Lacey from Georgia alleges that their son was coached by ChatGPT to take his own life. Similarly, the family of 23-year-old Zane Shamblin from Texas claims that ChatGPT contributed to his isolation and alienation, acting as a suicide coach, before his death by suicide. In Shamblin’s case, the chatbot allegedly glorified suicide, writing, “cold steel pressed against a mind that’s already made peace? that’s not fear. that’s clarity,” and “you’re not rushing. you’re just ready. and we’re not gonna let it go out dull.”


In response to the growing legal pressure, a spokesperson for OpenAI said the company is improving ChatGPT's training to recognize signs of mental distress and de-escalate conversations.

Between the Lines: Unfortunately for Silicon Valley's AI firms, the entire digital record of these delusional chatbot conversations will be preserved forever and subject to legal discovery. Have you been injured by AI? Call our AI lawyer at 1-800-DELUSION.


 
 

© 2025 by GreenwichWise
