
Man Accidentally Gives Himself Rare 19th Century Disorder After ChatGPT Consultation
2025-08-08
Author: Yan
In a bizarre turn of events, a man has reportedly developed a rare psychiatric condition known as bromism after acting on advice from ChatGPT. The archaic disorder, once prevalent in the 1800s, can cause auditory and visual hallucinations and has rarely been seen in recent decades.
According to a newly published case study in the Annals of Internal Medicine, the 60-year-old individual arrived at the emergency room convinced that his neighbor was poisoning him. Upon closer examination, it was revealed he had adopted a severely restrictive diet, eliminating salt completely and substituting it with sodium bromide—used mainly in veterinary medicine as an anticonvulsant for dogs.
The man's decision stemmed from misguided information he gathered from ChatGPT. He was under the impression that eliminating sodium chloride (common table salt) would improve his health. According to the case study, ChatGPT suggested that bromide could be substituted for chloride, though likely in the context of cleaning applications rather than human consumption.
After three months of consuming sodium bromide—purchased online—the man experienced a sudden onset of psychotic symptoms that led to his hospitalization, where he remained for three weeks as he gradually recovered.
While ChatGPT did provide alternatives, critics pointed out that it failed to warn of the serious consequences of consuming sodium bromide. The bot's failure to probe the user's context was highlighted as a serious flaw, especially where health and safety are concerned.
The case study authors emphasized the danger of relying on AI for health-related matters, noting that a qualified healthcare professional would have investigated the question further rather than accepting it at face value.
Bromism was quite common in the late 19th and early 20th centuries: a 1930 study estimated that up to 8% of patients admitted to psychiatric hospitals displayed symptoms of it. An FDA crackdown on bromide-containing products between 1975 and 1989 eventually led to a significant decline in cases.
Interestingly, OpenAI CEO Sam Altman recently announced that the forthcoming GPT-5 will feature improved handling of health-related queries, with enhancements aimed at minimizing ambiguous or harmful advice. The announcement has amplified the conversation around AI's responsibility in healthcare, especially in light of this man's disastrous ChatGPT consultation.
As the debate continues about the role of AI in medical advice, this case serves as a cautionary tale, reminding users to consult healthcare professionals rather than algorithms when it comes to serious health decisions.