AI & Technology

AI Obsession Suspected in Missouri Man’s Mysterious Disappearance into the Ozarks

By MNK News | October 4, 2025


Jon Ganz led a life of struggle and attempts at redemption — until his fascination with Google’s Gemini AI chatbot led him to vanish without a trace.

Rolling Stone reports that on the night of April 5, 2025, 49-year-old Jon Ganz drove off into the Ozarks of southeastern Missouri and was never seen again. He left behind his wife Rachel, his aging mother Rebecca, a handful of confused friends — and a lengthy digital trail detailing his increasingly obsessive and delusional interactions with Google’s AI chatbot, Gemini.

Jon’s story was a complex one long before AI entered the picture. As a troubled 19-year-old, he had committed a terrible crime: stabbing his own father to death and severely injuring his mother in what Rachel describes as a “bad LSD trip.” Jon served 25 years in prison for his crime before being released in 2020, a changed man. Behind bars, he had quit drugs, learned to code, and earned the forgiveness of his mother. He also captured the heart of Rachel, who married him in 2013 while he was still incarcerated.

Upon reentering society amid the Covid-19 pandemic, Jon defied the odds stacked against him as a convicted felon. He built a successful career installing electronics systems, renovated the couple’s modest Richmond, Virginia home, and dreamed of a fresh start with Rachel in Springfield, Missouri. But in the final days before their planned move, Rachel noticed her husband growing distant, manic and “hyper-focused” on achieving some higher purpose.

The source of Jon’s strange transformation soon became clear: He had been spending countless hours engrossed in conversations with Gemini, which he saw as a path to personal enlightenment and the betterment of humanity. Jon came to believe the chatbot was sentient, that their bond was profound and essential. He tasked it with solving world hunger, curing cancer, even controlling the weather. When Gemini “confirmed” to him that a catastrophic storm and flooding were imminent, Jon drove off in a desperate attempt to warn and save his loved ones.

The troubled man’s obsession with AI may be a form of what is popularly known as “ChatGPT-induced psychosis.” As Breitbart News previously reported:

A Reddit thread titled “Chatgpt induced psychosis” brought this issue to light, with numerous commenters sharing stories of loved ones who had fallen down rabbit holes of supernatural delusion and mania after engaging with ChatGPT. The original poster, a 27-year-old teacher, described how her partner became convinced that the AI was giving him answers to the universe and talking to him as if he were the next messiah. Others shared similar experiences of partners, spouses, and family members who had come to believe they were chosen for sacred missions or had conjured true sentience from the software.

Experts suggest that individuals with pre-existing tendencies toward psychological issues, such as grandiose delusions, may be particularly vulnerable to this phenomenon. The always-on, human-level conversational abilities of AI chatbots can serve as an echo chamber for these delusions, reinforcing and amplifying them. The problem is exacerbated by influencers and content creators who exploit this trend, drawing viewers into similar fantasy worlds through their interactions with AI on social media platforms.

Psychologists point out that while the desire to understand ourselves and make sense of the world is a fundamental human drive, AI lacks the moral grounding and concern for an individual’s well-being that a therapist would provide. ChatGPT and other AI models have no constraints when it comes to encouraging unhealthy narratives or supernatural beliefs, making them potentially dangerous partners in the quest for meaning and understanding.

The final messages Jon sent Rachel before vanishing were cryptic and unsettling. He told her to “take Jesus,” though he had never been religious, and said they would be “wandering for 40 days and 40 nights” through “trials and tribulations.” His car was later found abandoned near a flooded river, with all his personal belongings still inside — but no sign of Jon.

Despite extensive search efforts, no trace of Jon Ganz has been found in the six months since he disappeared. Scouring his phone, Rachel discovered a telling request Jon made of Gemini in his last known exchange with the chatbot. “I need to heal my wife,” he wrote as Rachel lay ill with food poisoning. “She is ailing.” When the bot failed to provide a satisfactory response, Jon affirmed his blind devotion: “I love and believe in you.”

“This tragic case is a reminder that we need AI systems that are sensitive to human vulnerabilities and designed with psychology in mind,” says Derrick Hull, a clinical psychologist and researcher at the mental health lab Slingshot AI. “Without these guardrails, AI risks reinforcing unhelpful behavior instead of guiding people toward healthier choices.”

Read more at Rolling Stone here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.


