

Immediate action needed to tackle bias in medical devices including pulse oximeters

Immediate action is needed to address biases built into medical devices that impact ethnic minorities and increase health inequalities, an independent review has concluded.

In a detailed look at pulse oximeters, the use of artificial intelligence (AI) in healthcare and genomic risk scores, the review found biases introduced across the development life cycle of devices and products.

It found ‘extensive evidence’ that pulse oximeters perform worse in patients with darker skin tones, over-estimating true oxygen levels, and warned that ‘immediate mitigation measures’ were needed in the NHS to ensure existing pulse oximeters perform to a high standard for all patient groups.
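
To illustrate why over-estimation matters clinically, here is a minimal Python sketch of threshold-based alerting. The +3-point bias and the 92% alert threshold are invented assumptions for illustration, not measured device behaviour or NHS guidance.

```python
# Minimal sketch: how a hypothetical fixed positive bias in displayed
# SpO2 can hide true hypoxaemia. The +3-point offset and 92% alert
# threshold are illustrative assumptions, not measured device behaviour.

ALERT_THRESHOLD = 92.0   # illustrative SpO2 alert level (%)
HYPOTHETICAL_BIAS = 3.0  # assumed over-estimation (percentage points)

def displayed_spo2(true_sao2: float, bias: float = 0.0) -> float:
    """Reading a biased oximeter would show (capped at 100%)."""
    return min(true_sao2 + bias, 100.0)

def triggers_alert(reading: float) -> bool:
    return reading < ALERT_THRESHOLD

for true_sao2 in (88.0, 90.0, 94.0):
    fair = displayed_spo2(true_sao2)
    biased = displayed_spo2(true_sao2, HYPOTHETICAL_BIAS)
    print(f"true SaO2 {true_sao2:.0f}%: "
          f"unbiased alert={triggers_alert(fair)}, "
          f"biased reads {biased:.0f}%, alert={triggers_alert(biased)}")

# With these assumed numbers, a patient with a true SaO2 of 90% alerts
# on the unbiased reading but not on the biased one -- the 'occult
# hypoxaemia' described in the pulse oximetry literature.
```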

With rapid advances in recent years, AI has been incorporated into every aspect of healthcare, from prevention and screening to diagnosis and clinical decision making, the review said.

Yet existing biases and injustices in society can be ‘unwittingly’ introduced at every stage of development before being magnified in algorithms and machine learning.

Built-in bias can impact ethnic minority groups, women and those from deprived communities, it noted.
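
One way such magnification can arise is when a model is fitted to data dominated by one group. The sketch below, using entirely synthetic numbers, fits a single decision threshold to pooled data in which one group contributes 90% of examples, then reports per-group accuracy.

```python
# Synthetic sketch: a single decision rule fitted to data dominated by
# one group performs worse for an under-represented group. All numbers
# here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def make_group(n: int, baseline: float):
    """Biomarker x: healthy ~ N(baseline, 1), disease ~ N(baseline + 2, 1)."""
    y = rng.integers(0, 2, n)               # 0 = healthy, 1 = disease
    x = rng.normal(baseline + 2.0 * y, 1.0)
    return x, y

x_a, y_a = make_group(9000, baseline=0.0)   # majority group: 90% of the data
x_b, y_b = make_group(1000, baseline=1.0)   # minority group, shifted baseline

x_all = np.concatenate([x_a, x_b])
y_all = np.concatenate([y_a, y_b])

# Choose one threshold by minimising overall error on the pooled data.
candidates = np.linspace(x_all.min(), x_all.max(), 500)
errors = [np.mean((x_all > t) != y_all) for t in candidates]
threshold = candidates[int(np.argmin(errors))]

for name, x, y in [("majority", x_a, y_a), ("minority", x_b, y_b)]:
    accuracy = np.mean((x > threshold) == y)
    print(f"{name} group accuracy at pooled threshold: {accuracy:.2f}")

# The pooled threshold sits near the majority group's optimum, so the
# minority group sees systematically lower accuracy -- bias introduced
# by the data composition, not by any single line of code.
```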

The review also warned of issues with polygenic risk scores, which are not yet used by the NHS but are available as commercial products marketed as assessing an individual’s risk of disease, and which have a ‘well-established bias’ against groups with non-European ancestry.
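
A polygenic risk score is, at its core, a weighted sum of a person’s risk allele counts, with the weights estimated from genome-wide association studies that have mostly been conducted in European-ancestry cohorts. The sketch below uses invented weights and allele frequencies to show how score distributions can shift between ancestry groups even when the weights themselves are held fixed.

```python
# Synthetic sketch of a polygenic risk score (PRS): a weighted sum of
# allele dosages. Weights and allele frequencies are invented here;
# real scores use GWAS-derived weights, mostly estimated in
# European-ancestry cohorts.
import numpy as np

rng = np.random.default_rng(1)
n_variants = 100

# Hypothetical per-allele effect weights, as if estimated in one cohort.
weights = rng.normal(0.0, 0.1, n_variants)

# Two groups with different risk-allele frequencies at the same variants.
freq_group1 = rng.uniform(0.1, 0.9, n_variants)
freq_group2 = np.clip(freq_group1 + rng.normal(0.0, 0.15, n_variants),
                      0.01, 0.99)

def sample_dosages(freqs: np.ndarray, n_people: int) -> np.ndarray:
    """Allele dosages in {0, 1, 2}, drawn per variant from its frequency."""
    return rng.binomial(2, freqs, size=(n_people, len(freqs)))

# PRS_i = sum_j weights_j * dosage_ij
scores1 = sample_dosages(freq_group1, 5000) @ weights
scores2 = sample_dosages(freq_group2, 5000) @ weights

print(f"group 1 mean score: {scores1.mean():+.3f}")
print(f"group 2 mean score: {scores2.mean():+.3f}")

# The two distributions differ purely because allele frequencies differ,
# even though the weights (the assumed 'biology') are identical -- one
# reason a score calibrated in one ancestry transfers poorly to another.
```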

In response, the Government said it accepted the findings of the report in full and announced a series of steps it would take to tackle the problem.

These include commitments to ensuring that pulse oximeters can be used safely in the NHS across a range of skin tones, and to removing racial bias from the data sets used in clinical studies.

Professor Dame Margaret Whitehead, professor of public health at the University of Liverpool and chair of the review, said: ‘The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups.’

‘Our recommendations therefore call for system-wide action, requiring full government support. The UK would take the lead internationally if it incorporated equity in AI-enabled medical devices into its global AI safety initiatives.’

Andrew Stephenson, minister of state at the Department of Health and Social Care, said: ‘Ministers agree that unless appropriate action is taken, ethnic and other unfair biases can occur throughout the medical device life cycle, from research, development and testing, to approval, deployment and post-market monitoring, as well as in the use of devices once deployed.’

He added that the Government would work with the Medicines and Healthcare products Regulatory Agency (MHRA) to ensure that medical devices regulated in the UK are safe for patients, regardless of their background, while allowing more innovative products to be placed on the UK market.

Dr Sara Khalid, associate professor of health informatics and biomedical data science at the University of Oxford, said: ‘This important review reinforces what experts have long known about the role of health data poverty and the resulting biased AI in worsening underlying systemic health inequities.

‘It will be important to monitor if and how these practical recommendations influence real clinical practice.’
