
October 7, 2025
by Ferdous Al-Faruque

Experts: AI can enhance clinical trial participant selection

Editor's note: This article was updated to correct the presentation of the name of one of the companies mentioned.

SAN DIEGO — Industry experts say that using artificial intelligence (AI) for medical device clinical trials can have financial, logistical, and ethical benefits, arguing that the technology can help researchers target the right trial participants and reduce the number of participants needed.
 
Daniel Hawkins, CEO of Vista AI, noted that selecting the right patient population for clinical trials is a significant hurdle, especially for startups. He said that many clinical trials have faced substantial challenges because researchers couldn’t identify patients most likely to respond to the intervention.
 
Researchers have traditionally thought of AI as a tool for analyzing trial data, but Hawkins said there’s a significant opportunity to use it at other stages, including before the trial begins.
 
“The reality is all about how you use [AI],” said Hawkins. “There are ways in the development cycle to use AI to advance the pace of an idea to the first patient.
 
“I think there’s a really interesting opportunity with AI, even if your product isn’t primarily AI,” he added.
 
Furthermore, Hawkins said that AI can be used to reduce the number of patients required to conduct a trial effectively. For startups in particular, he said, this could make a company more attractive to investors, accelerate the trial, and get the product to market faster.
 
Darrell Swenson, director of engineering at Medtronic, noted that using AI for clinical trials isn’t just a valuable tool for targeting potential trial participants. It may also be the right thing to do to ensure that patients aren’t unnecessarily exposed to treatments that may not work for them. He noted that Medtronic has been using AI for computational modeling, in silico studies, and to evaluate efficacy and safety.
 
“First of all, [using AI in such ways] is the right thing to do, morally,” said Swenson. “The more we can know about whether something is going to be successful before it goes into humans, the more ethical it is to actually do a human trial.”
 
He further emphasized that AI is the way of the future and companies that don’t embrace the technology risk being left behind.
 
Sandra Rodriguez, senior industry analyst at Axendia, noted that FDA has embraced digital transformation and technologies such as AI and has made efforts to work more collaboratively with companies to get promising products to market.
 
“They’re trying to work as hard as they can with the industry to allow you to catch up to where they already are,” said Rodriguez. “And we always say, you don’t want to be behind your regulators on the adoption of technology, you don’t want them to know more about your product than you do.
 
“Regulators really aren’t the barrier; a lot of times it’s organizational inertia or regulatory inertia,” she added.
 
According to Rodriguez, while larger companies face organizational hurdles when adopting new technologies, they often have the money to buy the tools or hire people to help them make the change. She advised companies that lack those resources or personnel to partner with solution providers who can help them meet those needs.
 
Swenson spoke about how FDA and industry have evolved their thinking on using in silico models to evaluate products. He noted that the agency and companies have published research on the successful use of in silico models and on how to integrate virtual patients into them. He added that with AI’s rapid growth and promise, along with new statistical methods, the models have become more viable, though they will not replace clinical studies.
 
“There are a lot of things coming together right now to make this perfect trifecta, this perfect timing, where this is the golden age of in silico clinical studies, and a lot of that has been enabled because of the barriers that AI is breaking down,” said Swenson.
 
The panelists also discussed some of the challenges to adopting AI. They noted that clinicians and nurses are often skeptical about using AI and that there is often a lack of expertise in handling it. They said there is concern about data biases that can negatively affect the AI’s performance and worries about mistakes associated with using generative AI, such as hallucinations.
 
Despite the concerns, they noted that the technology has a lot to offer the healthcare system, and regulators also recognize its potential.
 
“There’s going to come a time when the agency is going to ask you, ‘What have you done to select your patients using AI? ... How do we know these are the right patients? What models have you used?’” said Hawkins. “I do believe we are going to get to that place because patient selection is that critical, and the agency is trying to protect not just the patients that are treated but the patients that are studied.”
