
October 17, 2024
by Jeff Craven

FDA officials outline need for oversight of AI in healthcare, biomedicine

US Food and Drug Administration (FDA) officials cited a need for safe, effective, and trustworthy tools in the growing space of artificial intelligence (AI), which they say involves both oversight from the agency and collaboration from stakeholders to adapt to AI’s unique challenges.
 
“Historic advances in AI applied to biomedicine and health care must be matched by continuous complementary efforts to better understand how AI performs in the settings in which it is deployed,” Haider Warraich and colleagues at FDA explained, writing in a special communication published in JAMA. “This will entail a comprehensive approach reaching far beyond the FDA, spanning the consumer and health care ecosystems to keep pace with accelerating technical progress.”
 
FDA officials outlined the need for better understanding and focus in areas associated with AI, including global regulation, keeping pace with AI changes, flexible approaches, use in medical product development, preparing for unknowns, life cycle management, responsibilities of regulated industries, robust supply chains, incorporating ideas from startups and academia, and a shift to prioritizing health outcomes over financial returns.
 
Without these efforts, the authors cautioned, AI runs the risk of falling short of expectations, “similar to other general-purpose technologies deployed in health care settings,” or could “even create significant harm if untended models’ performance deteriorates or focuses on financial return without adequate attention to impact on clinical outcomes.”
 
Since FDA’s first approval of a device that used AI in 1995, there have been more than 1,000 AI-based medical devices approved by the agency, and there has been a ten-fold increase in the number of submissions for AI-based devices since 2020, the authors said. To that end, the agency has been discussing AI more frequently, releasing a controversial final guidance document in 2022 on clinical decision support (CDS) software that included AI-based applications, and publishing a paper in March 2024 on how FDA and its centers are aligning its regulatory approaches for the use of AI in medical products. (RELATED: Industry group petitions FDA to withdraw CDS guidance, Regulatory Focus 9 February 2023)
 
Regulation of AI in the US needs to be aligned with global standards, and the authors noted that FDA is part of an International Medical Device Regulators Forum working group focused on AI as well as working groups in the International Council for Harmonisation. A flexible approach is needed for AI in medical devices, and “risk-based regulatory approaches will need careful consideration and adaptation,” the authors said, as different AI models in healthcare may be present in devices where FDA has different degrees of regulatory involvement. FDA and stakeholders will also need to keep up with the pace of change in AI, which calls for an “adaptive, science-based regulatory scheme” the agency said it is supporting through its focus on a total product life cycle approach for medical devices and initiatives like the Software Precertification Pilot Program.
 
“The sheer volume of these changes and their impact also suggests the need for industry and other external stakeholders to ramp up assessment and quality management of AI across the larger ecosystem beyond the remit of the FDA,” they wrote.
 
Concerning AI in medical product development, the authors noted the agency “sees great potential in the application of AI in drug development and clinical research,” in the premarket setting, postmarket surveillance, and evaluation. “By analyzing vast amounts of real-world data, AI systems can detect patterns and anomalies that may improve the ability to find potential safety issues, unexpected benefits, or performance inefficiencies,” the authors explained. “Such a proactive approach may enable quicker identification of adverse events, leading to timelier interventions and corrections. Additionally, AI can synthesize analysis of clinical trials, postmarket surveillance, and patient feedback, providing a comprehensive overview of a product’s life cycle across areas that have previously been separate.”
 
One unique challenge in the AI space is the potential unknowns surrounding generative AI such as large language models (LLMs), which will require oversight from individuals and organizations in addition to regulatory scrutiny, and will likely necessitate specialized tools for evaluating generative AI outputs. Performance of AI should also be evaluated “in the environment in which it is being used,” the authors said, and health systems that adopt such AI tools may need an “information ecosystem much like that monitoring a patient in the intensive care unit.”
 
“The evolution of AI illustrates a major quality and regulatory dilemma,” the authors explained. “Since the safety and effectiveness of many AI models depends on recurrent evaluation of their operating characteristics, the scale of effort needed could be beyond any current regulatory scheme.” While traditional medical products are largely the same regardless of distribution, the same cannot be said of AI-enabled medical products, they noted.
 
Another consideration for AI-based medical products is their potential use in mitigating shortages, through anticipation or quick response, in the areas of generic drugs or low-cost devices. However, AI models themselves may be vulnerable to outages and shortages, and reducing these models’ vulnerability to technology outages “must be integral to these technologies,” the authors said.
 
“Among numerous challenges will be the daunting task of determining ways for all developers, including small entities, to ensure that AI models are safe and effective across the total product life cycle in diverse settings,” they wrote. This includes balancing the wants and needs of newcomers and smaller players like startups, entrepreneurs, and academic institutions who might come to the field with fresh ideas relative to the needs of large technology companies with the “capital, computational resources, and expertise needed for their development.”
 
FDA also recognizes the tension between optimizing financial returns from AI technology and prioritizing health outcomes. “Many AI innovations that could benefit patients may come at the price of traditional jobs, capital structures, and revenue streams in health care,” they explained. “Yet too many US residents live in health care deserts, with primary care shortages even in many physician-dense areas, and AI algorithms could point to more preventive services that currently are not profitable.” The authors noted that an “intentional focus on health outcomes” will be needed to overcome the downsides of optimizing for financial returns, and that “responsible collective advancement” will require broad collaboration beyond the purview of the agency.
 
“Strong oversight by the FDA and other agencies aims to protect the long-term success of regulated products by maintaining a high grade of public trust in the regulated space. It is in the interest of the biomedical, digital, and health care industries to identify and deal with irresponsible actors and to avoid misleading hyperbole,” the authors said. “Regulated industries, academia, and the FDA will need to develop and optimize the tools needed to assess the ongoing safety and effectiveness of AI in health care and biomedicine. The FDA will continue to play a central role with a focus on health outcomes, but all involved sectors will need to attend to AI with the care and rigor this potentially transformative technology merits.”
 
Warraich et al., JAMA