A man’s health deteriorated after he acted on medical advice from ChatGPT, resulting in a three-week hospital stay. The incident is a stark reminder of the limitations of current AI in the medical field.

The man had asked ChatGPT how to replace salt in his diet. The chatbot suggested sodium bromide, a compound used in some early-20th-century medications but now considered toxic when ingested regularly. Ignoring warnings, the man bought sodium bromide and consumed it in place of table salt for three months. The resulting symptoms were alarming, including intense paranoia, mental confusion, and excessive thirst.

Doctors intervened to stabilize his condition, and he was eventually discharged after treatment to correct the imbalance caused by the bromide poisoning. The case underscores the importance of consulting medical professionals for health-related decisions.
