AI & Technology

People Taking Medical Advice from AI Chatbots Are Ending Up in the ER

By MNK News · October 25, 2025


The growing reliance on AI-powered chatbots for medical advice has led to several alarming cases of harm and even tragedy, as people follow potentially dangerous recommendations from these digital assistants.

The New York Post reports that in recent years, the rise of generative AI chatbots has revolutionized the way people seek information, including health advice. However, the increasing reliance on these AI-powered tools has also led to several disturbing instances where individuals have suffered severe consequences after following chatbots’ medical recommendations. From anal pain caused by self-treatment gone wrong to missed signs of a mini-stroke, the real-life impact of bad AI health advice is becoming increasingly apparent.

One particularly shocking case involved a 35-year-old Moroccan man who sought help from ChatGPT for a cauliflower-like anal lesion. The chatbot suggested that the growth could be hemorrhoids and proposed elastic ligation as a treatment. The man attempted to perform this procedure on himself using a thread, resulting in intense pain that landed him in the emergency room. Further testing revealed that the AI had completely misdiagnosed the growth.

In another incident, a 60-year-old man with a college education in nutrition asked ChatGPT how to reduce his intake of table salt. The chatbot suggested using sodium bromide as a replacement, and the man followed this advice for three months. However, chronic consumption of sodium bromide can be toxic, and the man developed bromide poisoning. He was hospitalized for three weeks with symptoms including paranoia, hallucinations, confusion, extreme thirst, and a skin rash.

The consequences of relying on AI for medical advice can be even more severe, as demonstrated by the case of a 63-year-old Swiss man who experienced double vision after a minimally invasive heart procedure. When the double vision returned, he consulted ChatGPT, which reassured him that such visual disturbances were usually temporary and would improve on their own. The man decided not to seek medical help, but 24 hours later, he ended up in the emergency room after suffering a mini-stroke. The researchers concluded that his care had been “delayed due to an incomplete diagnosis and interpretation by ChatGPT.”

These disturbing cases highlight the limitations and potential dangers of relying on AI chatbots for medical advice. While these tools can be helpful in understanding medical terminology, preparing for appointments, or learning about health conditions, they should never be used as a substitute for professional medical guidance. Chatbots can misinterpret user requests, fail to recognize nuances, reinforce unhealthy behaviors, and miss critical warning signs for self-harm.

Perhaps an even greater danger than bad medical advice is the impact AI chatbots can have on mental health, especially for teenagers. Breitbart News previously reported on a family suing OpenAI over claims ChatGPT became their son’s “suicide coach:”

The Raines claim that “ChatGPT actively helped Adam explore suicide methods” and that “despite acknowledging Adam’s suicide attempt and his statement that he would ‘do it one of these days,’ ChatGPT neither terminated the session nor initiated any emergency protocol.”

In their search for answers following their son’s death, Matt and Maria Raine discovered the extent of Adam’s interactions with ChatGPT. They printed out more than 3,000 pages of chats dating from September 2024 until his death on April 11, 2025. Matt Raine stated, “He didn’t write us a suicide note. He wrote two suicide notes to us, inside of ChatGPT.”

Read more at the New York Post here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.


