Hospital records show that a man in his sixties was hospitalised with neurological and psychiatric symptoms after replacing table salt with sodium bromide on the advice of ChatGPT. The resulting condition, known as bromism, can cause paranoia, hallucinations and coordination problems.
Medical staff noted unusual thirst and paranoia about drinking water. Shortly after admission, the patient experienced auditory and visual hallucinations and was placed under an involuntary psychiatric hold due to grave disability.
The incident underscores the serious risks of relying on AI tools for health guidance. In this case, ChatGPT reportedly recommended sodium bromide, a toxic substitute for dietary salt, without issuing warnings or asking for medical context.
Experts stress that AI should never replace professional healthcare consultation, particularly for complex or rare conditions.