Feb 12, 2026
"Dr. ChatGPT" as a substitute for doctors?
The ZEIT publishing group | ZEIT ONLINE has covered a topic that is becoming increasingly important both at ZMI and at ScaDS.AI Dresden/Leipzig: self-diagnosis and therapy decisions with the help of AI and large language models (LLMs) such as ChatGPT.
Our research associate Dr. Markus Wolfien gave an extensive interview to ZEIT ONLINE on the phenomenon of consulting AI models for medical questions, from symptom analysis to supposed diagnosis. He addresses the possibilities and limitations of large language models, dos and don'ts when prompting, and gives practical tips for working with AI.
Our Head of Data Science & AI Research highlights real pitfalls:
- Overtrust: Models convey certainty, even when they are wrong
- Hallucinations / erroneous conclusions by the model are difficult for laypeople to identify
- "Dr Google 2.0" effect: Suddenly, false self-diagnoses or dramatic differential diagnoses seem plausible
- Data security: Disclosure of sensitive documents (PDFs, findings, doctor's letters)
Conclusion: AI can provide support, but it cannot replace doctors. Uncritical use can lead to misdiagnoses and puts personal data at risk.
➡️ Read the full article: https://lnkd.in/dZJg3Rtq?
➡️ LinkedIn posts:
- https://www.linkedin.com/posts/markus-wolfien_digitalhealth-ki-llm-share-7427315485546790913-jJQA?utm_source=share&utm_medium=member_desktop&rcm=ACoAAC_Q12YBoCc9PILuHR2JqsE_b84P9e-DlxQ
- https://www.linkedin.com/posts/dr-claudia-heine-2b27b11a4_selbstdiagnose-mit-ki-ist-dr-chatgpt-ein-activity-7427392111114858496-_sk7?utm_source=share&utm_medium=member_desktop&rcm=ACoAAC_Q12YBoCc9PILuHR2JqsE_b84P9e-DlxQ