One guy’s attempt to use ChatGPT to improve his diet landed him in the hospital when it recommended that he pull his ingredients from a Victorian-era medicine cabinet.
According to a case study published in the Annals of Internal Medicine, a 60-year-old man wanted to cut sodium chloride, aka table salt, from his diet. And as so many do nowadays, he turned to ChatGPT for alternatives. The AI chatbot suggested sodium bromide, a substance typically found in swimming pools.
The man dutifully followed the advice. Not long after, he developed symptoms like confusion, hallucinations, and eventually, full-blown psychosis. Doctors diagnosed him with bromism, a condition so rare today that it might as well wear a top hat and say things like “Pray, good sir, if I may be so bold…”
Back in the late 1800s and early 1900s, bromide salts were all the rage for treating everything from headaches to general Victorian ennui. But the problem with bromides is that they build up in the bloodstream, leading to serious neuropsychiatric issues. At one point, bromism accounted for up to 8 percent of psychiatric hospitalizations, until regulations in the 1970s phased them out of public life.
Despite this, when 404 Media tried to replicate the man’s queries, ChatGPT continued to recommend sodium bromide as a substitute for sodium chloride, without a single warning about toxicity, or even a gentle suggestion not to ingest pool chemicals.
This all likely happened on an earlier version of ChatGPT, but still, it’s yet another example of how AI chatbots struggle with nuance and context. Even when, during 404 Media’s testing, ChatGPT asked for more context, it still lumped all sodium compounds together as if they were interchangeable, not realizing that one might be better suited for flavoring a chicken breast while another is better for driving yourself insane.
The post Man Asked ChatGPT for Diet Advice and Ended Up With Victorian-Era Psychosis appeared first on VICE.