
UK Study Finds Discriminatory Bias in Healthcare AI Tools

Ethnic minorities, women and people from underserved communities are at risk of receiving poorer healthcare due to discrimination within AI tools and medical devices, a report has revealed.

Among other findings, the Equity in Medical Devices: Independent Review raised concerns about devices that use artificial intelligence (AI), as well as those that measure blood oxygen levels. The team behind the review said urgent action is needed.


Professor Frank Kee, director of the Centre for Public Health at Queen's University Belfast and co-author of the review, said: “We would like to take an equity perspective across the entire lifecycle of medical devices, from initial testing, to the recruitment of patients, whether in hospital or in the community, to early-phase studies and implementation in the field after licensing.”

The government-commissioned review was launched in 2022 after concerns were raised about the accuracy of pulse oximeter readings in people from black and minority ethnic backgrounds.

The report confirmed concerns that pulse oximeters overestimate the amount of oxygen in the blood of people with dark skin. While there is no evidence that this has affected care in the UK public health sector, harm has been documented in the US, where such biases have led to delayed diagnosis and treatment, as well as worse organ function and death, in Black patients.


Team members emphasised that they are not advising people to avoid these devices. Instead, the review sets out a series of measures to improve the use of pulse oximeters in people of different skin tones, including monitoring changes in readings rather than relying on single readings, and offers guidance on how to develop and test new devices to ensure they work well for patients of all ethnicities.

The report also highlighted concerns about devices based on artificial intelligence, including the potential for the technology to exacerbate the underdiagnosis of heart conditions in women, lead to discrimination based on patients' socioeconomic status, and result in the underdiagnosis of skin cancers in people with darker skin tones. Concerns about the latter, the authors say, stem from the fact that AI devices are largely trained on images of lighter skin tones.
