February 14, 2025
by Jeff Craven

Gottlieb: FDA CDS AI classification ‘at odds’ with 21st Century Cures

The US Food and Drug Administration’s (FDA) current thinking on classifying some artificial intelligence (AI) tools integrated in software like an electronic medical record (EMR) as medical devices “could be at odds with the original intent of laws that were designed to regulate digital health tools based on their clinical use,” former FDA commissioner Scott Gottlieb argued in a recent JAMA Health Forum article.
 
Gottlieb said FDA’s final guidance on clinical decision support software (CDSS) “added new uncertainties” when it classified AI tools as medical devices regardless of intended use, particularly when integrated into an EMR.
 
“The current regulatory posture of the FDA for classifying tools with these advanced analytical capabilities as medical devices could impede or even block their integration into EMR systems,” he said. “This policy could encumber one of the most high-impact applications of these tools—the ability to embed them alongside a patient’s health record, where they are able to synthesize multiple complex data streams and generate novel clinical insights.”
 
The final guidance has proven controversial since it was published in September 2022, garnering a petition for its removal by the Clinical Decision Support Coalition, and legal arguments that the guidance goes against the statutory language laid out in the 21st Century Cures Act. (RELATED: Industry group petitions FDA to withdraw CDS guidance, Regulatory Focus 09 February 2023; Legal expert: FDA’s CDS software guidance is a ‘disaster’ for industry, Regulatory Focus 29 September 2022)
 
Gottlieb said the final guidance outlines situations, such as automation bias and time criticality, in which "clinicians may not have the opportunity or impetus to apply their own judgment to the recommendations from the CDSS," as well as software that integrates data from multiple sources, and notes that these AI-enabled tools would be classified as medical devices.
 
“Based on these considerations, it is possible that any AI functionality that is integrated into an EMR could fall outside the initial exemption and render the new tool, and indeed the entire EMR, a medical device subject to premarket review,” he said.
 
Gottlieb said that Cures contains “clear criteria” for when digital health tools would be considered medical devices, but the final guidance “inadvertently imposed a ceiling” on AI functionality by defining specific capabilities that would warrant a premarket review.
 
“A solution lies in returning to the intent of the 21st Century Cures Act and the policies advanced from 2017 to 2019. The intent was to regulate CDSS based on how the data analysis is presented to health care clinicians instead of focusing on how clinicians would use the information to inform their judgment,” he said. “If these AI tools are designed to augment the information available to clinicians and do not provide autonomous diagnoses or treatment decisions, they should not be subjected to premarket review.”
 
When it comes to AI-enabled tools in EMRs, Gottlieb explained that FDA could evaluate how the tools are designed and validated before they come to market and verify whether the tools enhance medical decision-making in a postmarket setting using real-world evidence.
 
“Artificial intelligence has an inherent ability to synthesize complex information streams and deliver enhanced analyses or recommendations that might otherwise evade notice. That aptitude alone should not classify them as devices,” Gottlieb said.
 
In an interview, Aaron Kesselheim, professor of medicine at Harvard Medical School and director of the Program On Regulation, Therapeutics, And Law (PORTAL) group at Brigham and Women’s Hospital and Harvard Medical School, who was not involved with the article, said more oversight of AI-enabled tools is needed at the agency. FDA officials also recently called for AI oversight in a special communication published in JAMA. (RELATED: FDA officials outline need for oversight of AI in healthcare, biomedicine, Regulatory Focus 17 October 2024)
 
“I think we need a system of FDA oversight of all AI-enabled tools used in health care delivery given the lack of knowledge we have about how well AI interacts with health care decision-making and the potential that the involvement of AI could cause substantial harm to patients,” Kesselheim told Focus. “I would think that the AI industry would welcome fair and transparent regulatory oversight that minimizes exemptions because it would help strengthen physician and patient confidence in using AI-enabled products in knowing that there was reliable data supporting their use that the FDA had reviewed.”
 
Focus also contacted the Office of the Center Director in the Center for Devices and Radiological Health at FDA for comment on Gottlieb’s article; in response, a communications officer said FDA did not have any information to share.
 