
March 12, 2024
by Mary Ellen Schneider

UK seeks to curb bias in medical devices

The UK government is taking steps to limit potential bias in the performance of medical devices, including removing racial bias from datasets used in clinical trials and improving the transparency of data used in medical devices that include artificial intelligence (AI).  
 
The policy actions come in response to an independent review of equity in medical devices commissioned by the UK Department of Health and Social Care. The 18 recommendations from the independent panel focused on areas thought to be especially vulnerable to potential bias, including optical devices such as pulse oximeters; polygenic risk scores (PRS) in genomics; and AI-enabled medical devices.
 
“The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups,” wrote Margaret Whitehead, chair of the review and professor of public health at the University of Liverpool. “Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning.”
 
Optical devices
 
In looking at pulse oximeters, the panel found “extensive evidence of poorer performance” when used for patients with darker skin tones and pointed to studies in the US that linked that performance to delayed recognition of disease and delayed treatment. However, the panel did not find similar evidence to show that bias in pulse oximeters impacted care in the UK. (RELATED: FDA advisors want standards, labeling to address racial disparities with pulse oximeters, Regulatory Focus 03 November 2022)
 
“We did not find any evidence from studies in the NHS of this differential performance affecting care but the potential for harm is clearly present,” the panel wrote.
 
Other optical devices reviewed include near-infrared spectroscopy (NIRS), transcutaneous bilirubinometers and dermoscopes. “Evidence is mixed but is suggestive of a degree of bias in such optical devices,” the panel wrote.
 
The panel made several recommendations for improving optical devices, including that:
  • The Medicines and Healthcare products Regulatory Agency (MHRA) should update its guidance to patients/caregivers and developers/manufacturers to ensure existing pulse oximeters are used safely and equitably.
  • MHRA and approved bodies should strengthen standards for approval of new pulse oximeters, including requiring data to show accuracy in darker skin tones.
  • Manufacturers should design smarter oximeters that address measurement bias.
  • UK professional bodies should perform an equity audit of commonly used optical devices.
  • Researchers and other stakeholders should increase skin tone diversity in medical imaging databanks used to test optical devices.
  • Regulators should monitor and audit optical devices post-approval in real-world conditions.
 
AI-enabled devices
 
The panel warned that the acceptance of AI-enabled device technology as routine “could obscure their potential to generate or exacerbate ethnic and socio-economic inequities in health.”
 
Bias in AI-enabled devices can occur from the way that health problems are prioritized for device development, how the data are selected when developing and testing devices, how the outcomes are defined, how the underlying AI algorithms are developed and tested, and how the device’s impacts are monitored in real-world use, according to the report. 
 
“The emerging evidence points to a critical need for patients and clinicians to contribute to better articulation and prioritisation of the health ‘problems’ (for the device to solve), and for better AI and health equity literacy that will ultimately help us focus on the best data and outcomes that should count most in possible solutions to these biases,” the panel wrote.
 
The panel emphasized a “whole-system approach” in its recommendations, including that:
  • AI-enabled device developers should engage with diverse groups of patients to allow a co-design process that includes equity, fairness and transparency.
  • The government should create an academy to improve understanding of equity in AI-enabled medical devices.
  • AI device developers should be transparent about diversity and completeness of data.
  • MHRA should adjust its risk assessment of AI-enabled devices so that everything except the “simplest and lowest risk technologies” is categorized as Class IIa or higher.
  • NHS England should update the digital technology assessment criteria (DTAC) used by health and social care teams in buying digital tech to include equity as part of the pre-purchase validation checks.
  • The government should convene an expert panel to assess and monitor the potential impact of large language and foundation models on AI quality and equity.
 
Polygenic risk scores in genomics
 
The panel also reviewed genomic devices that use PRS to assess disease risk. While currently used in direct-to-consumer tests, these devices have yet to be adopted by the NHS. The major equity concerns cited by the panel included the lack of diversity in the genetic datasets underlying PRS and broader societal issues, such as the potential for PRS information to be misinterpreted by the public.
 
“There is also the more immediate challenge for the NHS of dealing with patients’ concerns from PRS tests that are coming into the UK through commercial, direct-to-consumer routes without any regulation or support for the people who receive this sort of information,” the panel wrote.
 
Since there are already initiatives underway to address the lack of diversity in genetic databases, the panel focused its recommendations on the societal issues. The recommendations include:
  • Widening the focus of PRS studies beyond genetic diversity to include the contribution of social determinants of health on overall disease risk.
  • Funding research to improve knowledge and understanding around PRS.
  • Developing guidance for healthcare professionals on equity challenges of applying PRS testing in patient care and population health programs.
 
Government response
 
The UK government “fully accepted” the report’s conclusions and committed to making several policy changes.
 
Actions already being taken include MHRA requesting that approval applications for new medical devices describe how they will address bias and the National Institute for Health and Care Research accepting funding applications for research into smarter oximeters. Moving forward, the UK government said it will work to remove racial bias in datasets and ensure diverse skin tones are included in the data used in clinical trials. Additionally, the government pledged to work with partners to improve the transparency of data used in AI-enabled medical devices and other AI products that influence clinical decisions.
 
“Although considerable work is already being undertaken by multiple stakeholders and across different strands of work, we cannot stop here,” the UK government wrote in its response to the report. “Work to resolve and prevent health disparities is and will continue to be an ongoing priority across government and the health system as new technologies and issues emerge.”
 
In its response to the report, MHRA highlighted its plans to strengthen guidance for developers and manufacturers to improve diversity in the testing and development of medical devices, to ensure device regulation is “fit for purpose” by considering the diversity of users of AI and other software as a medical device or diagnostic, and to continue strengthening its vigilance role by engaging with patients and the public.
 
Equity in Medical Devices: Independent Review
 
MHRA Response