October 12, 2023
by Jeff Craven

Convergence: US and EU diverge on regulatory paths for AI/ML devices

MONTREAL – The United States and Europe are taking very different paths in their attempts to create a regulatory framework for artificial intelligence (AI) and machine learning (ML)-driven devices, a pair of panelists told attendees at RAPS Convergence 2023.
 
Monika Bhatt, chief FDA and healthcare regulatory compliance counsel at Siemens Healthineers, said the US framework for software that incorporates AI falls under the purview of clinical decision support (CDS) software, one of the health-related software exemptions in the 21st Century Cures Act. The definition for CDS is broad, Bhatt said, and can include tools like computerized alerts and reminders, clinical guidelines, condition-specific order sets, specific patient data and summaries, templates for documentation, and reference information with contextually relevant factors.
 
When it was initially published in 2017, FDA’s draft guidance on CDS software drew scrutiny from industry members who believed FDA was overstepping its bounds as a regulatory body, which led to a second draft guidance in 2019 that was more lenient than the 2017 version, Bhatt said (RELATED: Industry Raises Concerns with FDA Draft Guidance on Clinical Decision Support Software, Regulatory Focus 15 March 2018).
 
In the final guidance, FDA made significant changes from the draft and noted it would regulate CDS software similarly to software as a medical device. The CDS Coalition recently petitioned the FDA to withdraw the guidance, reiterating its concerns about FDA overstepping its authority (RELATED: Industry group petitions FDA to withdraw CDS guidance, Regulatory Focus 09 February 2023).
 
CDS software meets the criteria for exemption from device regulation if it is not intended to acquire, process, or analyze a medical image or a signal from a device; if it is intended to display, analyze, or print medical information; if it is intended to offer recommendations about the prevention, diagnosis, or treatment of a disease to a health care professional (HCP); and if it is intended to enable the HCP to independently review the basis for the recommendations the software presents.
 
However, these exemption criteria introduce several challenges for software developers, Bhatt explained. For instance, “most software developers do not know what kind of conversations regularly happen between health care providers, and what kind of conversation is normal between a healthcare provider and a patient,” she said. The criteria also limit the type of data a CDS can use, she added.
 
FDA has also said CDS software used in an emergent setting does not meet the criteria for exemption, Bhatt explained. Additionally, the criterion concerning recommendations to HCPs uses the plural “recommendations,” which may pose a challenge for CDS software that presents a single optimal recommendation.
 
Another challenge involves CDS software labeling, which must include a “plain language description” of the software’s functions and its algorithm. Developers must decide whether they can provide that level of disclosure given intellectual property concerns, Bhatt noted.
 
“This is a heavy burden,” she said. “Those who are already marketing without this additional labeling, what should they be doing?”
 
If CDS software developers are currently following FDA’s 2017 or 2019 draft guidance, Bhatt recommended they reassess their software against FDA’s new criteria to determine whether it is a non-device CDS or a device CDS. Developers should also evaluate their labeling, as well as the software’s inputs and outputs, against the new criteria. Other considerations, such as whether the CDS is used in an emergent setting, whether it supports recommendations for patients and caregivers, and whether it provides a risk probability score, should be evaluated as well, Bhatt said.
 
EU’s Artificial Intelligence Act
 
Kenneth Fuh, product assessment expert at TÜV SÜD, said that in the EU, AI/ML device requirements fall into five main buckets based on provisions in the EU’s Medical Device Regulation (MDR) and In Vitro Diagnostic Medical Devices Regulation (IVDR): device description, the AI model, the AI lifecycle, AI risk management, and AI postmarket surveillance.
 
“[T]he goal here is to demonstrate compliance as well as to meet the intent of the regulations,” he said.
 
Fuh said the EU AI Act is currently being finalized, with negotiations ongoing among the Member States, the European Parliament, and the European Commission. He said the Act is expected to become law at the end of 2023 or in early 2024.
 
A key aspect of the EU AI Act is that its definition of AI is technology-neutral, remaining open to possible new technologies. The definition is “very, very broad,” he said, describing AI as “a machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions, that influence [the] physical or virtual environment.”
 
Under the EU AI Act, AI systems are classified into four risk categories: minimal or no risk, limited risk with transparency obligations, high risk, and unacceptable risk. Medical devices are automatically classified as high risk and are subject to the requirements of that level, including an ex-ante conformity assessment. “There’s no way you can try to justify to Notified Bodies that you should be a low risk or minimal -- forget about that argument,” Fuh said.
 
Taken together, developers of software as a medical device must not only meet the requirements of MDR or IVDR along with IEC 62304; they are also expected to classify their products under the EU AI Act’s risk levels and adhere to the corresponding requirements. “Your risk management processes have to capture these different classifications of risk,” he said.
 
Fuh recommended that manufacturers considering AI/ML technology in their medical devices and targeting the EU market be proactive in taking responsibility for their AI efforts. “We expect a knock-on effect with other global regulators to follow the EU's lead in terms of stringency and how they evaluate AI/ML enabled medical devices,” he said.
 