Man Hospitalised After Replacing Salt With Chemical Suggested by AI Chatbot

A 60-year-old man was hospitalised with severe psychiatric symptoms after reportedly following a ChatGPT-generated diet plan that advised replacing table salt (sodium chloride) with sodium bromide, a potentially dangerous industrial chemical.

According to a case report published in Annals of Internal Medicine: Clinical Cases, a journal of the American College of Physicians, the unnamed man had no prior psychiatric or medical history. He had studied nutrition in college and was conducting a personal experiment to improve his health by eliminating common salt from his diet after reading about its adverse effects.

In search of alternatives, the man turned to ChatGPT for guidance. The chatbot allegedly recommended sodium bromide, a compound that resembles salt in appearance but is chemically different and toxic in excess. While sodium bromide is occasionally used in medicine, it is more commonly found in industrial and cleaning products. Excessive ingestion can cause neurological, psychiatric, and dermatologic issues.

Over the course of three months, the man replaced table salt with sodium bromide purchased online and imposed extreme dietary restrictions, even distilling his own water at home. By the time he was admitted to hospital, he was experiencing intense thirst, hallucinations, paranoia, and problems with motor coordination.

Doctors diagnosed bromism, a toxic syndrome caused by bromide accumulation, and initially treated him with fluids, electrolytes, and antipsychotics, later moving him to an inpatient psychiatric unit after he attempted to escape. He spent three weeks in the hospital before being discharged in stable condition.

The incident highlights growing concerns over AI-generated health advice. The case report warns that AI systems like ChatGPT can produce scientifically inaccurate or misleading information, and that users should not treat such tools as a substitute for professional medical advice.

OpenAI, the developer of ChatGPT, clearly states in its terms of use that the service “may not always be accurate” and is not intended to diagnose or treat health conditions.
