Researchers funded by the US National Science Foundation and the Taiwan Ministry of Science and Technology showed that medical AI systems can readily learn to recognize racial identity in medical images, and that this ability is very difficult to isolate or remove.

Earlier work in medical imaging had hinted that AI models can pick up on a patient's race. However, there is no known link between race and medical imaging features that experts would notice when looking at the images. The researchers therefore set out to test rigorously how well AI can determine a patient's race from medical images alone.

Challenges in identifying race

The miseducation of algorithms is a serious problem. When artificial intelligence reflects the unconscious assumptions, racism, and biases of the people who built it, it can cause real harm. For example, some risk-assessment programs have wrongly flagged Black defendants as twice as likely to reoffend as white defendants. When an AI used healthcare cost as a proxy for health needs, it wrongly concluded that Black patients were healthier than equally sick white patients, simply because less money had been spent on their care. One AI even applied harmful stereotypes when casting actors for a play it wrote.

Many examples of bias have been documented in natural language processing, but MIT scientists have examined medical images, an important and largely unexplored modality. Using both private and public datasets, the team found that AI can accurately predict the self-reported race of patients from medical images alone. Using data from chest X-rays, limb X-rays, chest CT scans, and mammograms, the team trained deep learning models to classify race as white, Black, or Asian, even though the images contain no explicit indication of the patient's race. Notably, even the most experienced radiologists cannot do this, and it is not clear how the models managed it.
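The core finding, that a model can recover a label no human reader can see, can be illustrated with a toy experiment. The sketch below is not the study's model: it trains a simple multinomial logistic regression (a stand-in for the deep networks) on synthetic "images" that are pure noise plus a faint, class-specific texture buried well below the visible noise level.

```python
import numpy as np

rng = np.random.default_rng(0)
n_per_class, n_pixels, n_classes = 200, 64, 3

# Each class shares a faint, noise-like template (std 0.3) buried under
# pixel noise (std 1.0) -- a signal no human reader would spot by eye.
templates = rng.normal(0.0, 0.3, (n_classes, n_pixels))
X = np.vstack([t + rng.normal(0.0, 1.0, (n_per_class, n_pixels))
               for t in templates])
y = np.repeat(np.arange(n_classes), n_per_class)

# Multinomial logistic regression trained by gradient descent -- a toy
# stand-in for the deep networks used in the study.
W = np.zeros((n_pixels, n_classes))
Y = np.eye(n_classes)[y]                     # one-hot labels
for _ in range(300):
    z = X @ W
    z -= z.max(axis=1, keepdims=True)        # numerical stability
    p = np.exp(z)
    p /= p.sum(axis=1, keepdims=True)        # softmax probabilities
    W -= 0.1 * X.T @ (p - Y) / len(X)        # cross-entropy gradient step

acc = (np.argmax(X @ W, axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")       # well above the 0.33 chance level
```

Even this linear model classifies far above the one-in-three chance level, which is exactly what makes such results unsettling: a statistical learner can exploit distributed, invisible signals that no expert would ever report seeing.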

How did they tackle this problem?

The researchers examined many possible explanations, such as differences in anatomy, bone density, image resolution, and other factors, to determine how the models detect race in chest X-rays. However, none of these factors accounted for the result: the models retained a high ability to identify race from chest X-rays even when each candidate explanation was controlled for.
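One of these checks, degrading image resolution, can be sketched in the same toy spirit. The code below is an illustration, not the paper's pipeline: it low-passes synthetic images by block-averaging pixels and shows that a simple classifier can still recover the label when the signal is a coarse intensity shift rather than fine detail (an appended bias column gives the linear model the intercept it needs).

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 200, 64, 3

# Toy data: noise images whose only label signal is a faint uniform
# intensity offset (0.0 / 0.3 / 0.6) hidden under noise of std 1.0.
offsets = np.array([0.0, 0.3, 0.6])
X = np.vstack([rng.normal(o, 1.0, (n, d)) for o in offsets])
y = np.repeat(np.arange(k), n)

def block_average(X, factor):
    """Crude low-pass filter: average non-overlapping pixel blocks,
    mimicking a resolution-degradation check."""
    m, d = X.shape
    return X.reshape(m, d // factor, factor).mean(axis=2)

def train_accuracy(X, y, k, steps=300, lr=0.1):
    """Fit a multinomial logistic regression and report training accuracy."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # bias column
    W = np.zeros((Xb.shape[1], k))
    Y = np.eye(k)[y]
    for _ in range(steps):
        z = Xb @ W
        z -= z.max(axis=1, keepdims=True)
        p = np.exp(z)
        p /= p.sum(axis=1, keepdims=True)
        W -= lr * Xb.T @ (p - Y) / len(Xb)
    return (np.argmax(Xb @ W, axis=1) == y).mean()

acc_full = train_accuracy(X, y, k)
acc_low = train_accuracy(block_average(X, 8), y, k)   # 64 -> 8 "pixels"
print(f"full-res: {acc_full:.2f}, low-res: {acc_low:.2f}")
```

The point of the toy: when the predictive signal is global rather than high-frequency, throwing away resolution barely hurts the classifier, which is consistent with the study's observation that aggressive image degradation did not eliminate race prediction.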

Image source: The Lancet Digital Health

The scientists first showed that the models could predict race across various imaging modalities, datasets, clinical tasks, academic centres, and patient populations in the United States. They then took three prominent chest X-ray datasets and tested the model both on a held-out subset of the training data and on an entirely separate dataset. Finally, they trained racial identity detection models on non-chest X-ray images from other body parts, as well as chest CTs, to check whether the models' performance was limited to chest X-rays.
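The internal-versus-external validation step described above can be sketched in miniature. The code below is a hedged illustration, not the study's evaluation: it trains on one synthetic dataset, then scores both a held-out slice of that dataset and a second, noisier dataset standing in for a different hospital's scanner.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, k = 200, 64, 3

# Faint per-class templates stand in for whatever signal encodes the
# label; both simulated "hospitals" share the same underlying signal.
templates = rng.normal(0.0, 0.3, (k, d))

def make_dataset(rng, n, noise_sd):
    X = np.vstack([t + rng.normal(0.0, noise_sd, (n, d)) for t in templates])
    y = np.repeat(np.arange(k), n)
    return X, y

def fit(X, y, steps=300, lr=0.1):
    """Multinomial logistic regression via gradient descent."""
    W = np.zeros((X.shape[1], k))
    Y = np.eye(k)[y]
    for _ in range(steps):
        z = X @ W
        z -= z.max(axis=1, keepdims=True)
        p = np.exp(z)
        p /= p.sum(axis=1, keepdims=True)
        W -= lr * X.T @ (p - Y) / len(X)
    return W

def accuracy(W, X, y):
    return (np.argmax(X @ W, axis=1) == y).mean()

# "Internal" validation: random train/test split of one dataset.
Xa, ya = make_dataset(rng, n, noise_sd=1.0)
idx = rng.permutation(len(ya))
tr, te = idx[:450], idx[450:]
W = fit(Xa[tr], ya[tr])

# "External" validation: a second dataset with noisier images.
Xb, yb = make_dataset(rng, n, noise_sd=1.3)

acc_internal = accuracy(W, Xa[te], ya[te])
acc_external = accuracy(W, Xb, yb)
print(f"internal test accuracy: {acc_internal:.2f}")
print(f"external test accuracy: {acc_external:.2f}")
```

Testing on a genuinely different data source is the crucial part of this design: strong external performance is what rules out the possibility that the model has merely memorised quirks of one dataset.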

Conclusion

Their study showed that medical AI systems can easily learn to identify a person's self-reported race from medical images and that this ability is very difficult to remove. The researchers found that a patient's race can be predicted from medical imaging data alone, and that this holds across different settings and imaging modalities. They strongly suggest that all developers, regulators, and users of medical image analysis exercise caution when deploying deep learning models, because models could use this information to maintain or even worsen the well-known racial disparities in medical practice.

Their results show that future work on AI for medical imaging should explicitly report model performance across racial groups, so that any use the models make of racial identity information is brought into the open.

