Should we teach our residents to solve clinical cases using AI?
I’m sharing ChatGPT's response to my prompt below. I must say that ChatGPT seemed to read my thoughts on this question. I completely agree with all the points. ChatGPT's explanation and recommendations are much better in content and format than anything I could have produced, so I am simply presenting the result of our chat. It took ChatGPT 20 seconds to deliver this. Eduardo Mayorga MD
My prompt to ChatGPT
“I am a medical teacher and teach my residents to use AI to help them diagnose and treat their patients. Some of my colleagues believe that this is wrong. They say if I do this, they will not study to be able to do it without AI help. Give me your thoughts on this discussion.”
ChatGPT said:
This is a critical discussion in medical education, and your colleagues’ concerns are understandable. However, I strongly believe that teaching residents to use AI in diagnosis and treatment is not only appropriate but essential for modern medical practice—as long as it is done correctly.
Why AI in Medicine Is Beneficial
AI is Here to Stay
Whether we like it or not, AI will play a growing role in healthcare. Ignoring it would be like refusing to teach residents how to use imaging tools or electronic medical records because "they should be able to do it without them."
AI as an Aid, Not a Replacement
AI should be positioned as a support tool, not a substitute for medical knowledge. The goal is not to let AI diagnose for them but to help them make better, more informed decisions.
Better Learning, Not Less Learning
There is no evidence that using AI reduces learning. In fact:
AI can accelerate learning by providing instant case-based feedback.
Residents can see more cases in simulations, improving their exposure.
AI can highlight differential diagnoses, training them to think more broadly.
Improved Accuracy and Reduced Bias
AI can catch things a human might miss, reducing errors and cognitive biases. Studies show that AI-human collaboration often results in higher accuracy than either alone.
Teaching AI Literacy Is a Responsibility
If residents do not learn how to use AI responsibly, they will either use it incorrectly or not use it at all, falling behind their peers as the field advances.
How to Address Their Concerns
Emphasize Clinical Reasoning First
Residents should learn to think critically and make decisions before checking AI recommendations, just as we teach students to read an ECG before relying on software analysis.
Encourage AI as a "Second Opinion"
AI should complement human expertise, not replace it. Train residents to ask, "Does the AI’s suggestion make sense given what I know?"
Evaluate AI Performance Together
Have residents challenge AI output, spotting errors and understanding limitations. This reinforces their diagnostic skills rather than replacing them.
Integrate AI into Case Discussions
Use AI in structured, guided ways. For example:
Present a case and ask residents for their diagnosis before consulting AI.
Compare their reasoning with AI’s and discuss discrepancies.
Highlight where AI might be misleading or incorrect.
Reassure Colleagues That AI Cannot Replace Expertise
Residents will still need to interpret findings, manage treatment plans, and communicate with patients—skills AI cannot replicate.
Final Thought
The best doctors of the future will be those who can use AI effectively while maintaining their clinical reasoning and judgment. By teaching AI literacy now, you are future-proofing your residents’ education rather than weakening it.
Would you like me to provide examples of how to integrate AI into case-based teaching?
You said: “Yes, please”
ChatGPT said:
Here are some practical ways to integrate AI into case-based teaching while ensuring that residents develop strong clinical reasoning skills. These methods will help demonstrate AI's strengths and limitations while reinforcing their diagnostic and treatment decision-making abilities.
1. Pre-Decision AI Check (Delayed AI Integration)
🔹 How It Works:
Present a clinical case without AI involvement.
Residents discuss history, exam findings, differentials, and management.
Only once they have finalized their decision, introduce the AI's suggestions.
Compare AI’s diagnosis and reasoning with theirs.
🔹 Key Learning:
Forces residents to think before AI influences them.
Teaches them to critically evaluate AI recommendations rather than accept them blindly.
Reinforces their confidence in clinical reasoning while allowing them to challenge AI.
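For teachers who are comfortable with a little scripting, this delayed-AI step can even be automated. Below is a minimal sketch using the OpenAI Python library; the model name, prompt wording, and case text are assumed placeholders for illustration, not a tested protocol, and the same workflow can just as well be run by hand in a ChatGPT window.

```python
# Sketch of the "pre-decision AI check": the model is queried only AFTER
# the residents have committed to their own differential.
# Assumes the openai package is installed and OPENAI_API_KEY is set;
# the model name, prompts, and case text are placeholders.
from openai import OpenAI

client = OpenAI()

case_summary = "68-year-old with sudden painless monocular vision loss ..."  # placeholder case

# Step 1: residents commit to a differential before any AI involvement.
resident_differential = input("Residents' final differential: ")

# Step 2: only now is the model asked about the same case.
response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name
    messages=[
        {"role": "system",
         "content": "You are a clinical teaching assistant. "
                    "Give a ranked differential diagnosis with brief reasoning."},
        {"role": "user", "content": case_summary},
    ],
)
ai_differential = response.choices[0].message.content

# Step 3: show both side by side so the group can debate discrepancies.
print("Residents:", resident_differential)
print("AI:", ai_differential)
```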
2. AI vs. Resident Challenge
🔹 How It Works:
Provide an anonymized real patient case (or simulated case).
Ask AI for a differential diagnosis.
Residents must identify where AI is correct, where it’s wrong, and why.
🔹 Key Learning:
Strengthens ability to spot AI errors and biases.
Encourages a mindset of questioning AI outputs rather than trusting them blindly.
Enhances pattern recognition by seeing where AI struggles.
3. AI-Assisted Differential Diagnosis Expansion
🔹 How It Works:
Present a case and ask residents for a differential diagnosis.
Once they list their possibilities, ask AI to generate its own differential.
Compare lists and discuss:
What AI included that they missed (useful additions? irrelevant suggestions?).
What they listed that AI did not (does AI lack insight into rare cases?).
🔹 Key Learning:
AI can broaden their thinking, exposing them to diagnoses they might overlook.
Reinforces clinical prioritization—teaching them which AI-generated differentials matter and which do not.
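To make the comparison step concrete, the overlap and differences between the two lists can be tabulated in a few lines of Python; the diagnoses below are invented examples, and the exercise works just as well on a whiteboard.

```python
# Sketch: compare the residents' differential with the AI-generated one.
# The diagnoses below are invented examples; paste in the real lists in practice.
residents = {"central retinal artery occlusion", "giant cell arteritis",
             "retinal detachment"}
ai_list   = {"central retinal artery occlusion", "giant cell arteritis",
             "ocular ischemic syndrome", "vitreous hemorrhage"}

print("Agreed on:     ", residents & ai_list)  # shared diagnoses
print("AI added:      ", ai_list - residents)  # useful additions, or noise?
print("Residents only:", residents - ai_list)  # did the AI miss something?
```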
4. AI as a “Second Opinion” in Diagnostic Reasoning
🔹 How It Works:
Have a resident go through a case alone first and form their diagnosis.
Use AI to see if it agrees or disagrees.
If there’s disagreement, have the resident defend their reasoning and explain why they believe the AI is wrong.
🔹 Key Learning:
Residents defend their reasoning, solidifying their knowledge.
Encourages healthy skepticism of AI, fostering independent thinking.
Identifies potential AI blind spots in complex cases.
5. AI-Guided Case Simulations (Structured Guidance)
🔹 How It Works:
AI generates a realistic patient case, step by step.
Residents must:
Ask history questions.
Request investigations.
Interpret test results.
Make a diagnosis and decide treatment.
AI provides feedback at each stage, helping them refine their approach.
🔹 Key Learning:
Simulates real-world decision-making while AI assists.
Helps residents see the impact of missing key questions or tests.
Enhances their ability to use AI as a clinical assistant, not a replacement.
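One way to run such a simulation is an interactive chat loop in which a system prompt instructs the model to play the patient and reveal findings only when asked. Here is a bare-bones sketch; the model name and instruction wording are again assumptions, and in a classroom the "loop" can simply be a shared ChatGPT screen, since the point is only that the model holds information back until it is requested.

```python
# Sketch of an AI-guided case simulation: the model plays a patient and
# reveals history, exam findings, and results only when the resident asks.
# Model name and instructions are assumed placeholders.
from openai import OpenAI

client = OpenAI()

messages = [{
    "role": "system",
    "content": ("Simulate one realistic ophthalmology patient. Reveal history, "
                "exam findings, and test results only when explicitly asked. "
                "Once the resident commits to a diagnosis and treatment plan, "
                "give structured feedback on anything they missed."),
}]

while True:
    question = input("Resident (or 'quit'): ")  # history question, test request, or final plan
    if question.lower() == "quit":
        break
    messages.append({"role": "user", "content": question})
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})  # keep the whole dialogue as context
    print("Patient/AI:", answer)
```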
6. AI vs. Real-World Guidelines
🔹 How It Works:
Ask AI for a treatment recommendation.
Compare AI's answer to actual clinical guidelines (e.g., AAO, RCOphth, UpToDate).
Discuss where AI aligns with or deviates from evidence-based medicine.
🔹 Key Learning:
Residents learn that AI may not always be guideline-compliant.
Reinforces critical appraisal skills—residents must justify their decisions with evidence.
Exposes weaknesses in AI’s medical training datasets.
7. AI and Ethical Dilemmas in Medicine
🔹 How It Works:
Present a complex ethical case (e.g., end-of-life decisions, patient autonomy vs. beneficence).
Ask AI for a suggested course of action.
Have residents critique its response:
Does AI consider patient values?
How does it handle legal/ethical constraints?
Would a human clinician reach a different conclusion?
🔹 Key Learning:
AI lacks human empathy and nuanced ethical reasoning.
Strengthens residents' understanding of human-centered decision-making.
Reinforces that AI is a tool, not a decision-maker.
Final Thought
These strategies will help you train clinically sharp, AI-literate doctors who:
✅ Think first, check AI later.
✅ Challenge AI when needed.
✅ Use AI as a tool, not a crutch.