Podcast Episode
AI Doctors: When ChatGPT Gets It Right and When It Gets It Dangerously Wrong
January 30, 2026
Audio archived. Episodes older than 60 days are removed to save server storage. Story details remain below.
Over 230 million people now ask ChatGPT health questions weekly, with some crediting the AI chatbot for lifesaving diagnoses. But safety experts have named AI chatbot misuse as the top health technology hazard of 2026, documenting cases of incorrect diagnoses and dangerous advice.
The Rise of AI Health Consultations
A growing number of patients are crediting ChatGPT with identifying serious medical conditions that their doctors initially missed. More than 230 million people worldwide now ask the AI chatbot health-related questions each week, according to OpenAI.

In a case highlighted by NPR on January 30, New York consultant Bethany Crystal noticed red spots on her legs and turned to ChatGPT for guidance. The AI urgently recommended she seek emergency care, telling her she needed immediate evaluation for possible bleeding risk. Crystal was subsequently diagnosed with immune thrombocytopenic purpura, a rare autoimmune disorder that can cause dangerously low platelet counts. She believes she might not have reached the emergency room in time without the chatbot's insistence.
ChatGPT Health Launches
OpenAI launched ChatGPT Health on January 7, a dedicated platform allowing users to securely connect their medical records and wellness apps including Apple Health, MyFitnessPal, and Peloton. The company developed the tool in collaboration with more than 260 physicians across 60 countries. OpenAI emphasises that conversations within ChatGPT Health will not be used to train its AI models and that the feature is intended to support rather than replace medical care.

Growing Safety Concerns
Despite success stories, the nonprofit patient safety organisation ECRI has named AI chatbot misuse as the number one health technology hazard for 2026. The organisation documented cases where chatbots suggested incorrect diagnoses, recommended unnecessary testing, and even invented nonexistent anatomy whilst sounding authoritative.

Robert Wachter, chair of the Department of Medicine at the University of California San Francisco, described witnessing AI recommend the anti-parasitic medication ivermectin for testicular cancer. In another case published in the Annals of Internal Medicine, a 60-year-old man was hospitalised for three weeks with paranoia and hallucinations after allegedly misinterpreting ChatGPT's advice and replacing table salt with toxic sodium bromide.
A Divided Medical Community
Healthcare professionals remain split on the technology. Patient advocates note that unlike time-pressed physicians, AI has unlimited time to engage in exhaustive inquiry and can help identify rare conditions. However, Professor Zhang Wenhong, director of China's National Center for Infectious Diseases, warned that doctors who bypass clinical training to rely on AI will be unable to judge whether AI diagnoses are correct.

Published January 30, 2026 at 10:14pm