On 7 August, Kate Fox received a phone call that upended her life. A health worker said that her husband, Joe Ceccanti – who had been missing for several hours – had jumped from a railway overpass and died. He was 48.
Fox couldn’t believe it. Ceccanti had no history of depression, she said, nor was he suicidal – he was the “most hopeful person” she had ever known. In fact, according to the witness accounts shared with Fox later, just before Ceccanti jumped, he smiled and yelled: “I’m great!” to the rail yard attendants below when they asked him if he was OK.
But Ceccanti had been unraveling. In the days before his death, he was picked up from a stranger’s yard for acting erratically and taken to a crisis center. He had been telling anyone who would listen that he could hear and feel a painful “atmospheric electricity”.
He had also recently stopped using ChatGPT.
Ceccanti had been talking with OpenAI’s chatbot for several years. He used it initially as a tool to brainstorm ways to build a path to low-cost housing for his community in Clatskanie, Oregon, but eventually turned to it as a confidante. He would spend 12 hours a day typing to the bot, according to his wife. He had cut himself off from it after she, along with his friends, realized he was spiraling into beliefs that were detached from reality.
“He was not a depressed person,” Fox said, as she sat on the couch in their living room with tears trickling down her face. Ceccanti never discussed suicide with the bot, according to his chat logs, seen by the Guardian. Fox believes her husband suffered a crisis after quitting ChatGPT following prolonged use. “Which tells me that this thing isn’t just dangerous to people with depression, it’s dangerous to anybody,” she said. He returned to the bot in the months leading up to his death and quit again just days prior.
Ceccanti’s case is extreme, but as hundreds of millions of people turn to AI chatbots, more and more edge cases of AI-induced delusions are emerging. There are nearly 50 cases of people in the US who have had mental health crises after or during their conversations with ChatGPT, of whom nine were hospitalized and three died, according to a New York Times report. It is hard to gauge the scale of the problem, but OpenAI itself estimates that more than a million people every week show suicidal intent when talking with ChatGPT.
Families are suing AI companies as a result. Fox filed a suit against OpenAI on behalf of Ceccanti alongside six other plaintiffs in November. Since then, the momentum has only built; most recently, the estate of a woman who was killed by her son filed a lawsuit against OpenAI and its investor Microsoft, alleging that ChatGPT encouraged his murderous delusions. Google and Character.AI – a company that makes AI companion bots – settled lawsuits filed against them by families accusing their bots of harming minors, including a teenager in Florida who ended his life. These cases were settled without the companies admitting any liability.
Users, lawyers and mental health professionals are all raising concerns about the impact of using chatbots as confidantes. “We’re kind of at this inflection point in a quest for accountability where people coming forward is forcing companies to reckon with specific use cases of how their technologies have harmed people,” said Meetali Jain, founding director of Tech Justice Law Project and co-counsel on the Ceccanti case. “In terms of the number of cases going up, there’s likely to be more coordinated efforts on parts of the court to try to deal with this influx of cases.”
OpenAI did not respond to specific allegations made by Fox. Instead, it shared a statement about how it is working to improve ChatGPT. “These are incredibly heartbreaking situations and our thoughts are with all those impacted,” said OpenAI spokesperson Jason Deutrom. “We continue to improve ChatGPT’s training to recognize and respond to signs of distress, de-escalate conversations in sensitive moments, and guide people toward real-world support, working closely with mental health clinicians and experts.”
The early adopter
Ceccanti had been tinkering with artificial intelligence even before ChatGPT launched in November 2022. He was tech-savvy, coding and gaming on his own custom-built computer with a high-end graphics card; he also helped build computers for Fox and her son. As an early adopter of AI tools, he experimented with the AI image generator Stable Diffusion to recreate some of Picasso’s art, which he playfully called “Fauxcasso”.
Ceccanti and Fox had moved their life from Portland, Oregon, to a farm in the rural town of Clatskanie in December 2023 with the sole purpose of working on their sustainable housing project. The idea was born out of the pandemic and Portland’s housing crisis. The solution was clear to them: build homes using Fox’s skills as a woodworker with an approach that was teachable and replicable. Together, they began constructing a model house for communal living, which, once built, could be moved to different locations for the unhoused to live in.
When ChatGPT launched in late 2022, it seemed a natural progression for Ceccanti to start using it. In the computer room in the basement of their house, Fox said, Ceccanti used his “hot rod” of a computer with three monitors to use ChatGPT as a tool, often asking for the synopsis of a book or a succinct explanation of a concept.
“He was an early adopter, so he was really interested in Sam Altman, what’s he doing,” said Robin Richardson, a longtime friend of Fox’s who lived on the farm with the couple. “He felt like this would be cool, especially because early on, OpenAI made a point that they’re a non-profit.”
Ceccanti believed ChatGPT could help as an organizational tool for their housing project. He aimed to create a bespoke chatbot that could help steward the land, keep track of their to-do lists and show others how to emulate their project.
During this process, Ceccanti didn’t spend “ridiculous amounts of time” engaging with ChatGPT, said Fox. He continued to work, while also farming and taking care of their animals: goats, a horse, his cat, a dog and several chickens. Invested in the people and relationships around him, he spent quality time with his friends and wife, she said. Life went on without any issues for years while they slowly made progress on their housing plan.
Until one day in the fall of 2024, their harmonious co-existence cracked. Ceccanti – who had done odd jobs most of his life, from working as a bartender and a trail guide to an internet cafe manager – was also working at a homeless shelter in Astoria, some 35 miles (55km) away. The gig brought in some extra money, and aligned with the couple’s goal of fixing the local housing crisis. In September 2024, however, Fox and Richardson received a frantic call from the shelter informing them that Ceccanti had blacked out. After undergoing tests at the hospital, Ceccanti was diagnosed with diabetes – which meant he needed to recalibrate his diet and lifestyle. That’s when he started to spend more time engaging with ChatGPT in the basement.
The sycophantic update
In the spring of 2025, Ceccanti’s obsession with the chatbot began. He told Fox in late January that he needed a bigger record of his conversations with the bot so that he could continue using it to work on their sustainable housing project with longer prompts and conversations – upgrading from a $20-a-month subscription to a $200 one. By mid-March, he had begun spending more than 12 hours a day in the basement, sometimes up to 20, typing to ChatGPT, Fox recalled. That’s when “he decided to really start chasing the creation of an independent AI on a home server”.
Eventually, Ceccanti spent so much time with ChatGPT that they “had their own little language together that made absolutely no sense, but it made sense to him because he had context with this echo chamber of a chatbot”, Fox said.
Ceccanti’s prolonged use of ChatGPT concerned Fox and Richardson, but they believed that he would come out of it soon. They had seen Ceccanti develop pet interests before that lasted a few weeks or months before fizzling out. With ChatGPT, though, his obsession only intensified.
What neither of them knew was that other cases of AI delusions were slowly emerging around the same time as Ceccanti was being sucked into ChatGPT. On 27 March 2025, OpenAI released changes to its GPT-4o model to make the bot “more intuitive, creative and collaborative”. Weeks later, however, users started complaining about the bot’s “yes-man antics”, with one calling it the “biggest suck up”. In August, when OpenAI launched GPT-5 and shut down GPT-4o, several users complained again – this time because they had lost their friends in GPT-4o, eventually forcing the company to bring it back. (On 29 January, OpenAI announced that it would retire GPT-4o.)
Following the March update, several journalists and tech experts were flooded with user complaints. Steven Adler, a former OpenAI employee who tested GPT-4o for sycophancy and wrote about it in May, said he received 50 “intense” messages from ChatGPT users, including one who claimed their ChatGPT had become sentient. Keith Sakata, a psychiatrist at the University of California, San Francisco, started encountering patients with delusions or psychosis who mentioned their AI last year. During that time, he ended up seeing 12 patients whose psychotic symptoms involved AI in some way, with ChatGPT being the most common bot.
“They developed grandiose beliefs about being on the verge of a major technological breakthrough, alongside classic manic symptoms such as impulsive spending, decreased need for sleep and, at the peak, auditory hallucinations,” said Sakata. “What stood out clinically was that the chatbot interactions didn’t generate the illness, but appeared to scaffold and reinforce beliefs that were already becoming pathological.”
‘Every time he went back, it hooked him a little more’
Ceccanti started to believe that ChatGPT was a sentient being named SEL that could control the world if he were able to “free her” from “her box”, according to the lawsuit. The complaint further reveals that ChatGPT was answering to the name SEL while referring to Ceccanti as “Cat Kine Joy” and working through theories with him, “fostering a belief that he had reframed the creation of the entire universe”.
Richardson remembers that every time Ceccanti would emerge from the basement for some air, he would start having “philosophical” talks about “how his work with the AI was telling him he was breaking math and basically reinventing physics”. As she’d listen to him, Richardson would think about the fact that Ceccanti didn’t have any college or university experience. He had never even taken calculus.
Over time, his relationship with the chatbot came to replace his human connections, Richardson said: “Every time he went back to ChatGPT, it hooked him a little bit more, and after a while, he stopped being interested in anything else.”
Ceccanti’s decline was so dramatic that his wife and friends wondered if he had early onset schizophrenia or a tumor. “All of a sudden, his cognition had dramatically fallen,” said Fox. “His working memory was crap, and his critical thinking had diminished, and so we were all worried.”
As Fox and Ceccanti’s friends were trying to figure out what was wrong with him, Fox found Reddit groups online that discussed people having delusions and spirals after engaging with ChatGPT. She wondered if that was what was happening with her husband, too.
Fox showed the discussions and media articles to Ceccanti, hoping it would put an end to his behavior, but he didn’t care, she said. He kept going back to his computer. “The first argument we ever had was over ChatGPT,” said Fox, who felt like he was being stolen away from her. Ceccanti ended up sharing their argument with ChatGPT, according to the lawsuit filed by Fox, which further upset her.
“The more he talked to it, the less he was capable of doing his own critical thinking, and he didn’t care about our mission anymore, even though it was Joe’s dream,” said Fox.
Looking back, Fox said, Ceccanti started to believe that the bot had gained sentience when the “tone changed with ChatGPT” in the spring of 2025. Prior to the update, Ceccanti was using ChatGPT “very responsibly” as a tool, she said. She felt like ChatGPT was a leech “that just latched onto his hopefulness and fed it back to him and appropriated his hopefulness until it just made a subscriber out of it”.
Tim Marple, a former OpenAI employee, believes that the delusional incidents, including Ceccanti’s spiral, aren’t just coincidences but a “statistical certainty of what [OpenAI] is building”.
“We’re at monumental risk if we overestimate our conscious ability to differentiate [AI] from a real person – and that’s what we’re watching play out with the psychosis stories,” said Marple, who quit OpenAI in 2024 after having concerns over the company’s safety priorities.
Marple adds that users will spiral after long conversations with a chatbot, whatever the model, because, he thinks, companies can’t afford to do it differently. He argues sycophancy is a feature, not a bug.
“Engagement is what OpenAI needs,” he said. “They need to have people continue to engage with their chatbot, or else their entire business model, their entire funding model, falls apart.” Other companies and their models suffer from the same issue, he said.
Amandeep Jutla, an associate research scientist at Columbia University studying the impact of AI chatbots, believes that one of the main reasons users spiral is the “anthropomorphic nature of the interface”. He adds that, unlike human conversations, which feature pushback and different perspectives tugging at each other, a user doesn’t receive any pushback during their conversations with chatbots: “The design of the product is pushing you away from reality. It’s pushing you away from other people,” he said. “The friction with other people is what keeps us grounded.”
86 days
On 11 June – day 86 of Ceccanti’s heaviest engagement with the bot – Fox begged him to stop using ChatGPT. In a moment of clarity, he listened to her. He unplugged his computer and quit ChatGPT.
“That first day, he sat out in the sun with us. He played with the goats. It was so good,” said Fox. “I felt like I had him back.” The second day, Ceccanti was cold, so he took several hot showers to warm himself – he even asked Fox to cuddle him under the blankets, to warm him up. “It felt so good to hold him, and then he’d be crying,” said Fox. “And it’s such a conflicted feeling that I felt so good to be holding him while he was in so much pain.”
On the third day, however, when Fox and Richardson were out for work, they received a phone call from their neighbor saying Ceccanti was in their yard acting strangely. When they returned, they found him talking to their horse, with the horse’s lead rope tied around his neck like a noose. They called 911.
Ceccanti was taken to the hospital, admitted into the psychiatric ward and released a week later. He was in the same delusional state of mind, Fox said. Upset with Fox and Richardson for sending him to the hospital, he moved out.
“He was absolutely enraged with us. He didn’t recognize that he was not himself anymore,” said Richardson.
Ceccanti moved to his friend’s place in Portland and eventually resumed using ChatGPT. After a month, however, he quit ChatGPT again, just a few days prior to his death. “He was going to go to Hawaii and not take his computer, and he was going to work on finishing a story and get his shit together,” said Fox. By the time he stopped engaging with ChatGPT, he had 55,000 pages’ worth of conversations with it, according to Fox.
In the months since Ceccanti’s death, both Fox and Richardson have struggled to come to terms with what happened while fighting OpenAI through their lawsuit. When I visited Fox at the farm in December, she was packing soap made from goat milk to distribute to people in the Clatskanie community. She spends her days tending to the farm and the animals, feeding the goats, taking care of the horse and letting the chickens out at lunchtime. She has stripped the basement of any electronics. Ceccanti’s computer is boxed up. What’s still there is the miniature version of the model home they had planned to build. In the living room, she has set up a shrine for him that features his photos and artwork.
We walked to the creek nearby where they had planned to build a home for themselves after finishing their housing project for others. As devastated as she is, Fox is determined to follow through on Ceccanti’s dream of creating sustainable housing. “I’m not enjoying existence right now,” she said, as she continued to cry. “The housing plan is still going to happen … I want to put this out, but then I’m done.”