AI Gone Wrong: Man Poisons Himself After Taking Medical Advice from ChatGPT

A man was left fighting for his sanity after following AI advice and replacing table salt with a chemical more commonly used to clean swimming pools.
The 60-year-old American spent three weeks in hospital suffering from hallucinations, paranoia and severe anxiety after taking dietary tips from ChatGPT.
Doctors revealed in a US medical journal that the man had developed bromism – a condition virtually wiped out since the 20th century – after he embarked on a ‘personal experiment’ to cut salt from his diet.
Instead of using everyday sodium chloride, the man swapped it for sodium bromide, a toxic compound once sold in sedative pills but now mostly found in pool-cleaning products.
Symptoms of bromism include psychosis, delusions, skin eruptions and nausea – and in the 19th century it was linked to up to eight per cent of psychiatric hospital admissions.
The bizarre case took a disturbing turn when the man turned up at an emergency department insisting his neighbour was trying to poison him.
He had no previous history of mental illness.
The case, published in the Annals of Internal Medicine, warns that the rise of AI tools could contribute to ‘preventable adverse health outcomes’ – a chilling reminder of how machine-generated ‘advice’ can go horribly wrong.


