Billionaire Club Co LLC

The dangerous ChatGPT advice that landed a 60-year-old man in the hospital with hallucinations

By Reda Wigle

Consulting AI for medical advice can have deadly consequences.

A 60-year-old man was hospitalized with severe psychiatric symptoms — plus some physical ones too, including intense thirst and coordination issues — after asking ChatGPT for tips on how to improve his diet.

What he thought was a healthy swap ended in a toxic reaction so severe that doctors put him on an involuntary psychiatric hold.

After reading about the adverse health effects of table salt — which has the chemical name sodium chloride — the unidentified man consulted ChatGPT and was told that it could be swapped with sodium bromide.

Sodium bromide looks similar to table salt, but it’s an entirely different compound. While it’s occasionally used in medicine, it’s most commonly used for industrial and cleaning purposes — which is what experts believe ChatGPT was referring to.

Having studied nutrition in college, the man was inspired to conduct an experiment in which he eliminated sodium chloride from his diet and replaced it with sodium bromide he purchased online.

After three months on the swapped diet, he was admitted to the hospital, convinced that his neighbor was poisoning him.

The patient told doctors that he distilled his own water and adhered to multiple dietary restrictions. He complained of thirst but was suspicious when water was offered to him.

Though he had no previous psychiatric history, after 24 hours of hospitalization, he became increasingly paranoid and reported both auditory and visual hallucinations.

He was treated with fluids, electrolytes and antipsychotics and — after attempting escape — was eventually admitted to the hospital’s inpatient psychiatry unit.

Publishing the case study last week in the journal Annals of Internal Medicine Clinical Cases, the authors explained that the man was suffering from bromism, a toxic syndrome triggered by overexposure to the chemical compound bromide or its close cousin bromine.

When his condition improved, he was able to report other symptoms like acne, cherry angiomas, fatigue, insomnia, ataxia (a neurological condition that causes a lack of muscle coordination), and polydipsia (extreme thirst), all of which are in keeping with bromide toxicity.

“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” study authors warned.

OpenAI, the developer of ChatGPT, states in its terms of use that the AI is “not intended for use in the diagnosis or treatment of any health condition” — but that doesn’t seem to be deterring Americans on the hunt for accessible healthcare.

According to a 2025 survey, a little more than a third (35%) of Americans already use AI to learn about and manage aspects of their health and wellness.

Though relatively new, trust in AI is fairly high: 63% find it trustworthy for health information and guidance, scoring higher in this area than social media (43%) and influencers (41%), but lower than doctors (93%) and even friends (82%).

Americans also say that it’s easier to ask AI specific questions than to use a search engine (31%) and that it’s more accessible than speaking to a health professional (27%).

Recently, mental health experts have sounded the alarm about a growing phenomenon known as “ChatGPT psychosis” or “AI psychosis,” where deep engagement with chatbots fuels severe psychological distress.

Reports of dangerous behavior stemming from interactions with chatbots have prompted companies like OpenAI to implement mental health protections for users.

“While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information,” the report authors concluded.

“It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.”
