
May 14, 2025
by Ferdous Al-Faruque

Euro Convergence: Experts discuss AI Act’s future and impact on medtech

BRUSSELS – Medtech stakeholders said they are facing challenges hiring software and artificial intelligence (AI) specialists who can help them navigate the recently passed European AI Act in the context of the Medical Device Regulation (MDR) and In Vitro Diagnostics Regulation (IVDR).
 
A panel of experts discussed the AI Act and how it fits the European medtech landscape at RAPS Euro Convergence 2025. They also addressed the new challenges they face and expressed hope for new guidance and standards to offer greater certainty in the months to come.
 
Nada Alkhayat, a policy officer at the Directorate-General for Health and Food Safety (DG SANTE) at the European Commission, said the Commission hopes to provide as much clarity as possible on the AI Act and how it interplays with the MDR and IVDR. The Commission plans to publish a frequently asked questions (FAQ) document with about 40 questions on the topic by the end of the summer, though she emphasized that the deadline is not set in stone. She also noted that the Commission plans to release guidance on general-purpose AI obligations, and that national competent authorities have been asked to designate notifying authorities and market surveillance authorities under the AI Act by 2 August.
 
Alkhayat noted that the Commission and other stakeholders are advocating for those authorities to be the same ones that handle MDR and IVDR oversight. The selection of these authorities is expected to trigger a series of activities, including the designation of notified bodies under the AI Act.
 
Marco Caproni, global director for software product assessment at TÜV SÜD, said the notified bodies' deadlines for the AI Act are aligned with the Commission's. He said they are waiting for the announcement of the notifying and market surveillance authorities because those are the main counterparts they will be working with. In the meantime, they have been preparing for the deadlines by creating internal processes, training staff, and building up competence within their assessment team.
 
Sarah Mathew, artificial intelligence regulatory lead at BSI, said that her company is also working toward meeting the AI Act deadlines. She said that since March 2021, they have been building capacity, competence, and capability, which includes recruiting AI experts to become technical reviewers and improving their ability to conduct algorithm audits.
 
Koen Cobbaert, senior manager for regulatory science and policy at Philips, who moderated the panel, noted that Team-NB has said that only 20 of its 44 members are interested in getting AI Act designation, which is concerning for medtech companies who want to work with the same notified body.
 
"It shows that we are probably going to go for more consolidation of the field of notified bodies but also shows that there is hesitation among a lot of the notified bodies to get the designation," he said.
 
Mathew said that while BSI advises on how to interpret parts of the AI Act with MDR, there isn't much information on the pathway for accreditation for notification. She noted that the first challenge for them is that there isn't a lot of guidance yet on how the AI Act will be implemented, and the second challenge is recruiting competent personnel to their notified body.
 
“Everyone is struggling to get talent for data scientists, machine learning experts, they're very expensive and in the US they're even more expensive,” said Mathew. “We have to compete with other companies to get this talent and unfortunately the costs are going to go up because they're quite in demand.”
 
Caproni echoed the sentiment and said TÜV SÜD has been hiring software and AI specialists and bringing them up to speed on medical devices. He added that the lack of AI standardization has created significant risks for the industry.
 
Cobbaert noted that DG CONNECT has voiced optimism about the standards currently being developed in the EU. He said the first two standards, one on quality management systems and another on risk management, have been put up for ballot. He also noted that some international standards exist, but they have yet to prove that they can support the AI Act, and they have drawn considerable criticism for being inadequate.
 
"There's a rightful challenge from the Commission that [the international standards] may not actually bring the trust that we need," said Cobbaert. "It's a bit too early to now say what these draft standards will actually bring in terms of trust but I'm hopeful."
 
"On data management, there are quite solid standards out there," he added. "Ironically, they come from China with the Chinese medical device authorities using them.”
 
Alkhayat noted that when the AI Act was being drafted, a unit within the Commission began working with standardization committees; it is now working on more than a dozen standards on topics such as data governance and human oversight.
 
"We recognize that standards take time... Hopefully, these standards will help provide that necessary clarity in order to provide training to develop the right expertise and to perform the correct conformity assessments," she said. "Things are in the pipeline. Rome wasn't built in a day. So, we're getting there."