Some medical devices commonly used in the UK’s National Health Service (NHS) are less effective on darker skin tones and exhibit a bias towards white populations, according to an independent review commissioned by the government’s Department of Health and Social Care.

The review was commissioned following NHS concerns throughout the Covid-19 pandemic that pulse oximeters may not be as accurate for patients with darker skin tones as for those with lighter skin.

Titled ‘Equity in Medical Devices: Independent Review’, the final report calls for regulators, developers, manufacturers, and healthcare professionals to take immediate action to ensure existing pulse oximeter devices in the NHS can be used safely and equitably for all patient groups across different skin tones.

At the same time, the report highlighted that the use of AI in healthcare is equally liable to show bias, often overcorrecting in the wrong direction: for example, applying race correction factors based on erroneous assumptions about racial or ethnic differences, or devising fairness metrics for AI devices that misrepresent patient populations.

In her foreword, lead author Margaret Whitehead, chair in Public Health at the University of Liverpool, detailed how the report calls for the Medicines and Healthcare products Regulatory Agency (MHRA) to draft guidance for patients and healthcare providers on how some pulse oximeters may not be appropriate for patients with certain skin tones.

It is also calling for a government-appointed task force on large language models (LLMs) used in AI systems to assess the health equity impact of AI, along with the proper resourcing to take on the problem.


Whitehead said: “Few outside the health system may appreciate the extent to which AI has become incorporated into every aspect of healthcare, from prevention and screening to diagnostics and clinical decision-making, such as when to increase the intensity of care. Our review reveals how existing biases and discrimination in society can unwittingly be incorporated at every stage of the lifecycle of the devices and then magnified in algorithm development and machine learning.

“The evidence for adverse clinical impacts of these biases is currently patchy, though indicative. Seven of our recommendations are therefore focused on actions to enable the development of bias-free AI devices, with the voices of the public and patients incorporated throughout.”

Among the recommendations aimed at tackling inaccuracy in pulse oximeters is that health professionals advise patients who have been provided with a pulse oximeter for use at home to look at changes in readings rather than a single reading, to identify when oxygen levels are falling and when they need to call for assistance.

The review also calls for clinical guideline developers and health technology assessment agencies such as the National Institute for Health and Care Excellence (NICE) to produce guidance on the use of pulse oximeters that emphasises the variable accuracy of the devices across different skin tones.

In addition, it calls for the MHRA and approved bodies for medical devices to strengthen the standards for approval of new pulse oximeter devices, requiring sufficient clinical data to demonstrate accuracy both overall and in groups with darker skin tones.

In a bid to address concerns with AI, the review is also calling on the government to commission an online and offline academy to improve the understanding among all stakeholders of equity in AI-assisted medical devices. It also calls on researchers, developers and those deploying AI devices to be transparent about the diversity, completeness, and accuracy of data through all stages of research and development.

The report follows a poll conducted by medical diagnostic company Skin Analytics, which found that 84% of the British public was comfortable having their skin imaged by AI devices if it meant getting appointments more frequently. Previously, in the US, a study by medical technology manufacturer Dermasensor demonstrated that its skin cancer detection device can reliably detect cancers despite differences in melanin content and skin tone.
