
March 25, 2025
by Jeff Craven

Experts: Commonality exists in international AI regulation, but collaboration requires new approaches

Most countries agree on the principles of using and regulating artificial intelligence (AI) in healthcare, but few efforts in cross-national regulation of AI have emerged thus far, according to a recent policy corner article published in The New England Journal of Medicine.
 
The authors of the paper said that much of the technical work remains, and recent uncertainty around the United States' AI and regulatory policies has made cross-collaboration between the US and other countries on regulating AI in healthcare less likely.
 
“In general, cross-national regulation of AI also involves challenging requirements to coordinate between governments, industry, and international organizations within and across countries. Ultimately, any effort must manage the tension between perceived national interests and the value of international collaboration,” Saira Ghafur, of the Institute of Global Health Innovation at Imperial College London, and colleagues wrote in their paper.
 
In the US, former President Biden issued an executive order in late October 2023 that directed the Secretary of Health and Human Services (HHS) to take a pro-regulatory approach to AI in healthcare and in other federal agencies. However, President Trump rescinded this executive order during his second term in office, leaving the current fate of AI healthcare regulation uncertain. (RELATED: Experts: Trump executive orders create foundation for what’s to come, Regulatory Focus 24 January 2025)
 
Some countries, like the US, United Kingdom, Canada, and Australia, have chosen to regulate pre-generative AI as software as a medical device (SaMD) but have not adopted any laws regulating generative AI in healthcare. Japan has employed a regulatory approach it calls “agile governance” that assumes frameworks for regulating technology will be updated continuously and involve conversations with multiple stakeholders.
 
In the European Union (EU), generative AI would be regulated under the EU AI Act, which classifies all AI applications in healthcare as high risk and establishes AI legislation that encompasses all EU member states. The authors noted that "much detail about the law's implementation remains unclear, and whether the balance it achieves between government oversight and the pace of innovation will be appealing elsewhere remains to be seen."
 
Cross-national regulation of AI in healthcare
 
AI has been marked as a "topic of concern" among international organizations such as the Organisation for Economic Co-operation and Development (OECD), G7, United Nations, and World Health Organization, but there have been "few formal commitments" on collaborating across nations to identify priorities and create shared language around the issue of AI in healthcare, the authors said.
 
Still, most industrialized democracies identify regulation of AI in healthcare as necessary, they noted, and agree that there should be different regulatory pathways for pre-generative AI and generative AI. While pre-generative AI could likely be regulated through existing pathways that evaluate software as a medical device, regulating generative AI would require a new approach, the authors said. "Ultimately, the political will for such cross-national collaboration may originate among the clinicians and patients who stand to benefit from the free flow of information and technological advancements enabled by AI," they wrote.
 
David Blumenthal, corresponding author and president of the Commonwealth Fund, told Focus in an interview he thinks uncertainty about US AI policy or a decision to not regulate AI in healthcare “to any meaningful extent” would hinder collaboration with other countries in harmonizing global AI policies, and other countries may end up setting international standards independent of the US.
 
“The question is whether the Trump administration will feel strongly enough about these international regulations to use tariffs or some other form of coercion to seek exemptions for US AI products from regulations in other countries,” he said. “I doubt this would be high on the Trump administration’s list, but it is a possibility given recent experience.”
 
Editor’s note: This research was supported by grants from the Commonwealth Fund. Blumenthal is the current president of the Commonwealth Fund.
 
N Engl J Med Ghafur et al.