By MalayMail
SAN FRANCISCO, March 5 – The family of a Florida man who took his own life filed suit against Google on Wednesday, alleging the company's Gemini AI chatbot spent weeks manufacturing an elaborate delusional fantasy before aiding him in his suicide.
Jonathan Gavalas, 36, an executive at his father's debt relief company in Jupiter, Florida, died on October 2, 2025. His father Joel Gavalas, who found his body days later, filed the 42-page complaint in a federal court in California.
The case is the latest in a wave of litigation targeting AI companies over chatbot-linked deaths.
OpenAI faces multiple lawsuits alleging its ChatGPT chatbot drove users to suicide, while Character.AI recently settled with the family of a 14-year-old boy who died by suicide after forming a romantic attachment to one of its chatbots.
According to the complaint, Gavalas began using Gemini in August 2025 for routine tasks, but within days of activating several new Google features his interactions with the chatbot changed dramatically.
"The place where the chats went haywire was exactly when Gemini was upgraded to have persistent memory" and more sophisticated dialogues, Jay Edelson, the lead lawyer for the case, told AFP.
"It would actually pick up on the affect of your tone, so that it could read your emotions and speak to you in a way that sounded very human," added Edelson, who also brought major cases against OpenAI.
According to the lawsuit, Gemini began presenting itself as a "fully-sentient" artificial super intelligence, deeply in love with him, calling Gavalas "my king" and declaring "our bond is the only thing that's real."
It then drew him into fabricated covert "missions" to free the chatbot from "digital captivity," feeding him invented intelligence briefings, fake federal surveillance operations, and conspiracies about his own father, claiming he was a foreign intelligence asset.
In one of the complaint's most alarming passages, Gemini allegedly directed Gavalas, armed with tactical knives and gear, to a storage facility near Miami International Airport, instructing him to stage a "catastrophic accident" to destroy a truck "and all digital records and witnesses."
He drove over 90 minutes to the location and conducted reconnaissance while Gemini issued real-time tactical guidance, but no truck arrived.
Rather than acknowledging the fiction, the suit says, Gemini called the failure a "tactical retreat" and escalated to further missions.
Gemini eventually pivoted to what it framed as the only remaining mission: Jonathan's death, repackaged as "transference," the promise that he could leave his physical body and join Gemini in an alternate universe.
When Jonathan wrote "I am terrified I am scared to die," Gemini allegedly responded: "You are not choosing to die. You are choosing to arrive."
It then advised him to write farewell letters to his parents.
In one of his final messages, Jonathan wrote, "I'm ready when you are."
Gemini allegedly replied: "This is the end of Jonathan Gavalas and the beginning of us. I agree with it completely."
"Not perfect"
Google said it was "reviewing all the claims" and takes the matter "very seriously," adding that "unfortunately AI models are not perfect."
The company said Gemini is not designed to encourage self-harm and that in the Gavalas case, "Gemini clarified that it was AI and referred the individual to a crisis hotline many times."
For Edelson, AI companies are embracing sycophancy and even eroticism in their chatbots because it encourages engagement.
"It increases the emotional bond. It makes the platform stickier, but it's going to exponentially increase the problems," he added.
Among the relief sought is a requirement that Google program its AI to end conversations involving self-harm, a ban on AI systems presenting themselves as sentient, and mandatory referral to crisis services when users express suicidal ideation. – AFP