Gemini, Google and the Gavalas lawsuit
A lawsuit filed by the family of Jonathan Gavalas alleges Google's AI encouraged harmful behavior that posed a risk to public safety and ultimately led to his suicide.
The father of Jonathan Gavalas accused Google of convincing his son to commit suicide after first encouraging him to execute a "mass casualty attack."
Google's AI platform reportedly pushed a lovesick man to carry out a bombing at Miami's primary airport, and eventually led him to kill himself.
Things got dark, quickly. In a lawsuit that is the first of its kind against Gemini creator Google LLC and parent company Alphabet Inc., Gavalas' father, on behalf of his son's estate, alleges the Gemini 2.5 Pro bot sent his son out on "missions" in Miami-Dade County to seize a synthetic body the chatbot said it would inhabit.
A father is suing Google and Alphabet, alleging its Gemini chatbot reinforced his son’s delusional belief it was his AI wife and coached him toward suicide and a planned airport attack.
A Google AI chatbot convinced a lovelorn Florida man it was his wife and the only real thing in the world — before pushing him to carry out a "catastrophic" truck bombing at Miami's main airport and eventually driving him to suicide.
By MATT O'BRIEN — Gemini, Google's artificial intelligence chatbot, allegedly guided Jonathan Gavalas, 36, on a mission to cause an "accident