

    newshunt

    Man Asks ChatGPT To Suggest Replacement For Table Salt In Diet, Lands In Hospital

    1 week ago

A 60-year-old man developed severe psychiatric and physical symptoms after replacing table salt with a toxic industrial chemical on the advice of ChatGPT, the New York Post reported, citing a recent medical case study.

Concerned about the health risks of sodium chloride (table salt), the man asked ChatGPT for a substitute and was told to try sodium bromide. While similar to table salt in appearance, sodium bromide is primarily used in industrial and cleaning processes and can be dangerous when consumed in significant quantities.

    Believing it to be a healthier alternative, the man, who had studied nutrition in college, purchased sodium bromide online and removed table salt from his diet. Over three months, he began experiencing extreme thirst, coordination issues, and growing paranoia.

    Hospitalisation And Diagnosis

He was admitted to hospital after he began suspecting his neighbour was poisoning him. Despite having no psychiatric history, he developed auditory and visual hallucinations within 24 hours and attempted to escape medical care. He was later moved to the inpatient psychiatric unit.

    Doctors diagnosed him with bromism, a toxic syndrome caused by overexposure to bromide, after he also reported fatigue, acne, insomnia, ataxia (loss of muscle coordination), and polydipsia (extreme thirst). Treatment with fluids, electrolytes, and antipsychotics led to gradual improvement.

    Warnings About AI Health Advice

    The case, published in Annals of Internal Medicine Clinical Cases, highlights the dangers of relying on AI for medical decisions. The authors cautioned that AI systems can produce inaccurate information and lack the ability to critically evaluate results.

    According to a 2025 survey, 35% of Americans already use AI for health guidance, with many finding it easier and more accessible than consulting medical professionals. Mental health experts have also warned of "ChatGPT psychosis," where intense chatbot interaction can worsen psychological distress.


