Regulating Generative AI: Insights From the FDA’s First DHAC Meeting
The FDA’s new Digital Health Advisory Committee (DHAC) convened for the first time on November 20-21, 2024. The topic of discussion was medical devices that rely on generative AI (GenAI) models, which hold transformative potential for healthcare but pose unique regulatory challenges: how to define a device’s intended use, how to manage errors such as hallucinations (outputs that are not grounded in real data) and data drift (performance degradation as input data change over time), and how to address the evolving nature of the underlying AI models and the limited visibility into how they function.
Some examples of GenAI-enabled medical devices include:
- AI-Enhanced Imaging: Improves precision and speed in CT/MRI image reconstruction.
- AI-Driven Radiology Assistance: Detects disease and abnormalities based on imaging.
- AI-Assisted Diagnosis: Assists with diagnosis of disease based on the results of diagnostic tests, lab values, and clinical signs or symptoms.
- AI Healthcare Chatbots: Answers patients’ healthcare-related questions from a clinician’s perspective.
Because GenAI-enabled devices vary widely in their uses and their underlying models are updated frequently, the FDA assembled this committee to determine how to classify GenAI-enabled medical devices by risk, what evidence is needed to show they are safe and effective, and how to oversee these devices both before and after they reach the market.
To that end, the FDA asked stakeholders at the meeting to provide feedback on several topics and questions intended to refine the regulatory framework for these devices:
- Premarket Evaluation
  - What evidence and performance metrics are critical for evaluating the safety and effectiveness of GenAI-enabled devices?
  - How should factors such as human involvement, or the distinction between recalling existing information and generating new information, be addressed?
- Risk Management
  - What new opportunities and risks has GenAI created for medical devices, and what controls (eg, governance, training, feedback mechanisms) are needed to manage these risks effectively?
- Postmarket Monitoring
  - What specific monitoring strategies are needed to ensure GenAI-enabled devices remain accurate, reliable, and safe as they adapt to new data and user interactions over time?
While this marathon 2-day session was packed with far more information than can be covered in this post, stakeholder input centered on the idea that regulating GenAI-enabled medical devices will be an incremental process requiring continuous updates, and that these devices will need clear guidelines and proactive monitoring to enable responsible growth and use. Here are a few areas of specific stakeholder feedback:
- Equity and Bias Concerns: Stakeholders recommended establishing frameworks to mitigate bias in GenAI systems and track the impact of bias over time, as well as certification programs to ensure developers proactively address these risks.
- Post-Market Monitoring Repositories: Stakeholders suggested creating a centralized data repository to enable systematic error tracking and performance evaluation of GenAI across populations and settings.
- Clarifying Standardized Metrics: Stakeholders suggested developing standardized definitions and metrics for GenAI-specific risks, such as hallucinations and data drift (see the drift-monitoring sketch after this list).
- Synthetic Control Trials: Stakeholders proposed alternative evaluation methods, like synthetic control trials, to test the comparative efficacy of GenAI-enabled devices in clinical settings.
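To make the data-drift discussion more concrete, the sketch below shows one way such a standardized metric might be computed. It is a minimal, illustrative example only, not a method discussed at the meeting or endorsed by the FDA: it uses the population stability index (PSI), a commonly used distribution-shift statistic, applies the conventional rule of thumb that a PSI above roughly 0.2 warrants investigation, and runs on synthetic model confidence scores.

```python
import numpy as np

def population_stability_index(baseline, current, n_bins=10):
    """Compare the distribution of a model input or output between a
    baseline sample (e.g., at clearance) and a current production sample.
    A PSI above ~0.2 is often treated as a signal of meaningful drift."""
    # Bin edges come from the baseline distribution; outer edges are opened
    # so out-of-range production values are still counted
    edges = np.quantile(baseline, np.linspace(0, 1, n_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf

    base_frac = np.histogram(baseline, bins=edges)[0] / len(baseline)
    curr_frac = np.histogram(current, bins=edges)[0] / len(current)

    # Floor the fractions to avoid log(0) when a bin is empty
    base_frac = np.clip(base_frac, 1e-6, None)
    curr_frac = np.clip(curr_frac, 1e-6, None)

    return float(np.sum((curr_frac - base_frac) * np.log(curr_frac / base_frac)))

# Hypothetical example: monitor a device's confidence scores post-market
rng = np.random.default_rng(0)
baseline_scores = rng.normal(0.80, 0.05, 5_000)  # scores observed during validation
current_scores = rng.normal(0.72, 0.08, 5_000)   # scores observed in the field
psi = population_stability_index(baseline_scores, current_scores)
print(f"PSI = {psi:.3f} -> {'investigate drift' if psi > 0.2 else 'stable'}")
```

In practice, a postmarket monitoring program would track many such metrics across inputs, outputs, sites, and subpopulations, which is where a centralized repository like the one stakeholders proposed would add value.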
The insights from this first DHAC meeting highlight the need for a lifecycle approach to regulating GenAI-enabled devices, balancing innovation with safety and equity. Building on the Total Product Lifecycle (TPLC) framework discussed in the FDA’s executive summary, the regulatory strategy must continuously evolve to address GenAI’s unique risks and opportunities. The goal is not just to manage the risks but to harness the transformative potential of GenAI to align healthcare delivery more closely with evidence-based gold standards.
Aaron Csicseri, PharmD
Senior Scientific Director
Dr. Csicseri joined the ProEd team in November 2017 as a scientific director, responsible for scientific leadership, content development, strategic input, and effective moderation of team meetings. Aaron has extensive experience guiding Sponsor teams through the AdCom preparation process. He received his PharmD from the University at Buffalo, where he completed the clinical curriculum. Aaron has 10+ years of experience as a medical director/clinical strategist in accredited continuing medical education (CME), as well as in the non-accredited PromoEd sphere. Over the past 8 years, he has supported Sponsors in their preparations for FDA and EMA regulatory meetings across a wide variety of therapeutic areas. Aaron is based in Grand Island, NY, just outside Buffalo.