Industry groups urge FDA to use existing controls to monitor AI-assisted devices
Industry groups and companies in the medical device sector are urging the US Food and Drug Administration (FDA) to utilize its existing quality management systems (QMS) requirements to oversee the performance of artificial intelligence (AI)-assisted medical devices. They recommend against creating a new oversight framework for this purpose.
These calls come in response to a request for comments on a discussion paper published by FDA’s Center for Devices and Radiological Health (CDRH) last fall. FDA requested input on best practices, methodologies, and approaches for measuring and evaluating the real-world performance of AI-enabled medical devices. This includes strategies for identifying and addressing performance drift, in which changes to a model’s inputs or outputs over time degrade its real-world performance. (RELATED: FDA seeks input on evaluating AI-enabled medical devices, Regulatory Focus 6 October 2025)
The paper highlighted the need for ongoing and systematic performance monitoring to ensure the safe and effective use of AI by observing how these systems operate during clinical deployment.
According to comments from the American Hospital Association (AHA), FDA has approved more than 1,240 AI-enabled medical devices to date, with most of those approvals occurring within the past three years.
Groups say existing postmarket controls are sufficient
Medical device industry groups urged FDA to rely on existing controls for postmarket monitoring and evaluation of AI-enabled devices under the quality management system regulation (QMSR).
The Advanced Medical Technology Association (AdvaMed) said that “FDA’s request for public comment on real-world performance monitoring of AI-enabled medical devices raises critical questions about how to ensure safety and effectiveness while avoiding unnecessary regulatory burden. The appropriate answer is not to build new postmarket frameworks uniquely for AI, but rather to leverage the robust quality management systems (QMS) and regulatory structures already in place.”
The Medical Device Manufacturers Association (MDMA) concurred. The group wrote that “all medical devices are already subject to general controls associated with post-market monitoring and measurement obligations under the Quality Management Systems Regulation (QMSR) (transition from QMS in 2026). These general controls include monitoring and measuring feedback (ISO 13485:2016 clause 8.2.1), complaint handling (clause 8.2.2), control of nonconforming product (section 8.3), and improvement, including through the use of postmarket surveillance (section 8.5), among others.”
Olympus Corporation of the Americas, a medical technology company specializing in endoscopy, visualization, and minimally invasive therapies, also advocated using existing quality management systems to monitor these devices.
“Olympus believes FDA’s focus, as outlined in its Request, on real-world performance monitoring, data drift, and practical evaluation methodologies is both timely and essential and should be implemented by leveraging existing quality management systems (QMS) and regulatory structures, rather than creating parallel frameworks. Current QMS and risk-management requirements under the forthcoming QMSR aligned with ISO 13485 already provide a strong foundation for rigorous, scalable, and adaptable monitoring across AI technologies…Manufacturers already collect and assess real-world data, including device logs and complaint information, as part of existing surveillance programs.”
The Health Innovation Alliance (HIA), which represents patient advocates, health care providers, and other stakeholders, also urged FDA to use its existing tools to oversee AI-enabled devices.
“FDA's existing risk-based framework for medical devices has proven effective and should remain central to AI-enabled device regulation. Oversight should be proportionate to risk, mitigated by factors such as human involvement, reliance on technology, and data transparency. This approach ensures appropriate patient protections while avoiding undue burden. Different risk levels warrant different controls – from mandatory post-market studies for high-risk devices to lighter oversight for lower-risk administrative applications.”
AHA encourages FDA to update adverse event reporting
In its comments, AHA urged FDA to update its adverse event reporting system to reflect the unique features of AI-enabled devices and to counter the risks of bias, hallucinations, and model drift.
“The FDA should update adverse reporting mechanisms to provide a more nuanced approach for the unique factors that impact the model integrity of AI-enabled medical devices. While manufacturers are required to report adverse events through the Manufacturer and User Facility Device Experience tool as part of the Medical Device Reporting program, the existing reporting variables do not include details on AI-specific risks. For example, the reporting tool simply has categories for malfunctions, injury or death,” AHA wrote.