January 5, 2024
Building Better Guardrails for Algorithmic Medicine – Duke AI Health
Summary
AI tools in healthcare, which offer diagnoses, risk predictions, and decision support, are proliferating yet remain largely without regulation and oversight. Concerns include "black box" systems that fail to perform as expected, potentially compromising quality and safety and perpetuating inequities. Because design decisions and training data shape an AI system's performance, validation and ongoing assessment are essential. Duke University has established an oversight system for its AI tools and is a member of the Coalition for Health AI (CHAI), which is developing guidelines for health AI systems to ensure trust, fairness, and transparency. The guidelines aim to monitor and assess the performance of AI systems throughout their lifecycle.