August 10, 2025 / Lifestyle

"Bromide Poisoning" from AI's Faulty Diet Suggestions! The Pitfall of "Reducing Salt": Psychiatric Symptoms Emerge from AI Dietary Advice


1. Outline of the Incident: The Misstep That Began with "Low Salt"

On August 8 (local time), the science outlet Live Science reported that a 60-year-old man who consulted ChatGPT about his diet replaced table salt with sodium bromide (NaBr) and developed a rare poisoning syndrome known as "bromism." The case report was published in Annals of Internal Medicine: Clinical Cases (AIMCC). The man consumed NaBr for three months and eventually exhibited psychiatric symptoms such as paranoia and hallucinations, leading to his hospitalization.


2. What Happened: The "Bug" in Test Values and Psychiatric Symptoms

At the hospital, blood tests initially caused confusion. Bromide ions are easily read as chloride ions by common electrolyte assays, producing apparent hyperchloremia (pseudo-hyperchloremia). This distorts the overall picture of the electrolyte abnormalities and makes it hard for clinicians to identify the cause quickly. In addition to delusions and hallucinations, the man showed mood instability and disorientation, requiring psychiatric intervention, but he reportedly improved with fluid and electrolyte correction and antipsychotic medication.
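The distortion described above can be made concrete with a bit of arithmetic. The serum anion gap is conventionally computed as Na⁺ − (Cl⁻ + HCO₃⁻), and because many assays mistake bromide for chloride, the reported chloride is inflated and the computed gap can turn falsely low or even negative, a classic laboratory clue to bromism. The sketch below uses hypothetical numbers (not values from the case report) purely for illustration:

```python
# Illustrative sketch with made-up numbers: how bromide interference
# in a chloride assay distorts the computed anion gap.

def anion_gap(na: float, cl: float, hco3: float) -> float:
    """Serum anion gap: Na+ - (Cl- + HCO3-), all in mmol/L."""
    return na - (cl + hco3)

# Typical healthy values: the gap lands in the usual 8-16 mmol/L range.
normal = anion_gap(na=140, cl=104, hco3=24)   # -> 12

# Same sodium and bicarbonate, but the assay reads circulating bromide
# as chloride and reports an inflated chloride value: the gap goes
# falsely negative, hinting that something other than chloride is there.
bromism = anion_gap(na=140, cl=130, hco3=24)  # -> -14

print(normal, bromism)
```

A negative anion gap in an otherwise stable patient is exactly the kind of "bug" in the numbers that, as the article notes, can either mislead clinicians or, if recognized, point them toward bromide.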


3. What Is Bromism: Why Has This "Early 20th Century Disease" Reappeared?

Bromide salts were widely used as sedatives in the early 20th century, and chronic intoxication was known to cause neurological and psychiatric symptoms. Their medicinal use has all but disappeared in modern times, making cases rare. This case, however, shows how a "classic" intoxication can resurface through supplements or off-label substitutes. Live Science also noted that bromide-containing drugs were historically prone to overuse.


4. Why Did the Idea of "Salt→Bromine" Arise?

Bromine (Br) sits next to chlorine (Cl) on the periodic table. The two are chemically "similar," but their nutritional and physiological contexts are entirely different. In pool sanitation and industrial applications, bromine does appear as a "substitute" for chlorine, but substituting it at the dinner table is out of the question. According to the report, the man read about reducing chloride intake and, guided by his exchanges with the AI, obtained and used NaBr. It is a textbook example of "substitution" applied after being stripped of its context.


5. Media Coverage and Comments from Developers

Live Science carried a comment from OpenAI stating, "ChatGPT is not a substitute for professional medical advice. It is trained to encourage users to consult experts." Multiple international outlets and major websites followed up, emphasizing both the unusual nature of the case and the dangers of over-reliance on AI.


6. Reactions on Social Media: Criticism, Irony, and Calm Perspectives from the Field

On X (formerly Twitter) and Reddit, the article spread rapidly shortly after publication. The official AIMCC account summarized the key point of the original report: "For three months, sodium chloride was replaced with sodium bromide." Medical professionals and researchers commented one after another.

  • Calm posts on X cautioning that "AI ≠ medicine; it is no substitute for diagnosis or prescription" (e.g., summary posts from the data science community).

  • Sarcastic remarks on X that "classic bromism has been revived in the AI era."

  • Warnings on X, shared alongside foreign media articles, to "consult doctors or nutritionists for dietary advice."

  • In the Japanese-speaking community, posts probing the "self-responsibility" and literacy angle: "You heard salt was bad, so you swapped Na for Br... is that the AI's fault?"

  • In Reddit's medical threads, many summarized the case as "got chatting about nutrition and ended up on NaBr," with medical professionals commenting, "This is textbook bromism."

  • On tech boards, threads mocking the episode in harsh terms ("invited a 19th-century mental disorder upon oneself") gained traction.

On the other hand, calmer analyses cautioned against generalizing the incident into the "dangers" of AI as a whole, warning, "Beware of excessive extrapolation" and "This case is an anecdote, not a statistic." Commentators also noted how misinformation and sensational headlines crept into the chain of reporting (e.g., explanations that conflated the case with hyponatremia, as in a Times of India piece).


7. The Intersection of Technology and Humans: Why Does "Decontextualization" Occur?

Large language models generate the most statistically natural next word from patterns in vast amounts of text; designing them to safely integrate knowledge across domains such as nutrition, toxicology, and clinical testing remains an open limitation. Words like "substitute" and "replace" are especially dangerous when the context, use, dosage, and indication are not pinned down precisely. This case shows how the leap from "chemically similar" to "substitutable in real life" was made by a non-expert's interpretation, with the AI's output acting as the trigger for "experimentation."


8. Practical Guide to Minimizing Risks (For Readers)

  • Consult "human experts" for medical and nutritional advice: Treat AI responses as hypothesis notes.

  • Stop when terms like "substitute" or "replace" appear: Double-check the original source for use, dosage, and contraindications.

  • Lab values are not infallible: measurement principles have limits; bromide, for example, can register as chloride.

  • Cross-check news with multiple sources: distinguish primary information (the journal article) from secondary reporting (media).


9. Conclusion: "AI Is an Advisor, Not a Responsible Party"

This incident, in which salt's "neighbor" on the periodic table was invited to the dinner table, leaves three lessons for health literacy in the AI era: ① do not underestimate what is "similar yet different" in chemistry, ② understand the measurement principles behind test values, ③ always cross-check AI output with real-world experts. OpenAI stresses that its tool is "not a substitute for experts," but without the user's ability to stop and question, old intoxications can easily resurface. To reap the benefits of AI, the key skill to cultivate is healthy skepticism.


Reference Articles

A Man Sought Diet Advice from ChatGPT and Ended Up with Bromide Intoxication
Source: https://www.livescience.com/health/food-diet/man-sought-diet-advice-from-chatgpt-and-ended-up-with-bromide-intoxication