
November 4, 2025
by Ferdous Al-Faruque

Experts: AI tools can reduce the clinical evaluation workload for medtech firms

ROTTERDAM, NETHERLANDS — Artificial intelligence (AI) tools can be useful when conducting clinical evaluations during the medical device development process, but they have significant limitations that manufacturers should be aware of, according to experts who spoke at the RAPS European Lifecycle Management conference last week.
 
While such tools can save researchers a lot of time and resources, developers ultimately need to ensure the work is verified by humans, the experts agreed.
 
During the meeting, Andrew Gibson, senior managing consultant at Akra Team, and Carole Robin, head of the clinical department at Eurofins E&E France, discussed best practices for using AI in clinical evaluations. While there are several AI tools on the market with varying costs and capabilities, they emphasized that there is no single solution for all manufacturers.
 
They said that generative AI tools such as Elicit, Claude, and ChatGPT can significantly reduce the workload and time commitment of human staff by automating literature searches, screening, and data extraction.
 
Similarly, they said that tailored AI tools such as Flinn, DistillerSR, and CAPTIS can allow researchers to conduct multi-database searches, deduplicate articles, and extract data. When used effectively, these tools can improve efficiency and traceability, saving researchers 60% to 80% of their time, they said. They also noted that tailored AI can be less prone to hallucinating, which is an important consideration for healthcare product developers.
 
"It helps to generate or refine search strategy, it helps [to screen] the data extraction, it helps to remove duplicates, [and] it helps also to generate summary of clinical interaction in terms of limitation," Robin said.
 
However, she warned that the technology has the potential to hallucinate and needs human verification. “For generative AI, I would say that it can significantly reduce the workload,” she said. “However, it doesn't reduce the need for verification, but shifts the effort.”
 
Comparing generative AI tools to tailored AI tools, the experts noted that tailored AI may be better suited for medtech-focused clinical writing, may be more consistent, and may offer collaboration capabilities. They added that such tools may also have stronger data security and be less prone to hallucinations. On the other hand, they may need more human involvement to operate and could cost more than generative AI tools.
 
According to Gibson, several factors need to be considered when deciding whether to use AI or which AI to use, including the manufacturer’s size, growth stage, number of devices and risk classes of the devices in their portfolio, their internal policies and ability to adopt the technology, and potential compliance or legal risks.
 
Before choosing what AI to use, the panelists said it is also important to know the potential strengths and limitations of the technologies. They noted that AI tools can be user-friendly, free or relatively cheap, and can conduct fast and accurate translations, help refine clinical questions, extract data, and screen and summarize clinical literature. However, they can also have drawbacks and limitations, such as hallucinations, training bias, and inconsistent results. Furthermore, they are limited by the methods researchers use and require human verification.
 
“This is not a one click solution, a fully automated solution, and we don't really want that anyway, because we do want to keep human in loop,” said Gibson. “It kind of defeats the process of clinical evaluation, in my opinion, if it's a one button clinical evaluation generation.
 
“That might be offered in the future, I'm not sure,” he added. “But I would like to not have it fully automated.”
 
Gibson said that not all AI solutions are equivalent and when selecting a tool, sponsors should ensure they understand the basis of the AI and what it is doing.
 
“In all these solutions, I think it's important to keep [a] human in the loop, and we need qualified and competent resources internally to be able to assess the output of the AI in any of these cases,” said Gibson.
 
He added that it is important that humans oversee protocol generation, conduct queries, identify the right literature, decide inclusion/exclusion criteria, evaluate appraisal suggestions, and draft clinical evaluation report sections.
 
“We always want to be able to review and be confident in our ability to review the information that the AI is generating,” said Gibson. “We need to be able to speak to it clinically, from a regulatory perspective, otherwise, the review is thrown off.”
 
Finally, he added that the clinical evaluator needs to be able to sign off on the work done by the AI.
 
“In AI, the cost-benefit analysis is company specific, and there's no one answer,” said Gibson. “When comparing the conventional approaches to the benefits of generative AI or specific AI tools, we can see that the traditional, conventional approaches are time consuming and might be improved with the implementation of the AI solutions, but they don't replace human expertise.
 
“Not applying AI tools to the clinical evaluation is missing out on a very powerful tool to optimize the process and to accelerate that clinical evaluation,” he added. “But the researcher should remain as accountable for the final decision of the results of the clinical evaluation and needs to be in the loop and be aware of exactly all the decisions that are being made regarding the inclusion of publications and the extraction of data or just being evaluated.”
 
Gibson said that AI tools are improving all the time, and he expects significant improvements in the coming months that will lead to major changes in the way product developers write their clinical evaluation reports.