New AI skin cancer diagnosis system proved to be less effective on dark skin

By Monica Athnasious

Published Nov 11, 2021 at 09:30 AM

Reading time: 3 minutes


Every day, we learn more about how AI is being used in new and innovative ways to combat serious issues. One field that can’t seem to get enough of the technology is the medical world. From being used to diagnose early signs of dementia and helping paralysed people write with their minds to something as ridiculously funny as smart toilets protecting your health, AI is set to be the future of healthcare. Perfect, right? Well, not quite.

Much like most infrastructures and systems, AI is no stranger to the systemic racism that permeates almost all institutions of our current society. While there are examples of its use to actually combat racism—take Sajid Javid’s drive to utilise AI in tackling racial inequality throughout the NHS—the system is not without its faults. In fact, its use in recent years has shown a pattern of discrimination, from Facebook’s (now Meta) AI labelling black men as ‘primates’ to systems failing to distinguish different people of colour from one another. The tech has a long way to go, and its projected saturation in society has led to real fears that AI could soon ‘hack’ our biometric data.

Now, another supposed breakthrough in AI medicine comes with an obvious catch: new research details how AI systems being trained to diagnose skin cancer are far less effective on darker skin.

The new AI recognition technology works by training the systems to develop machine learning algorithms that can identify and diagnose specific types of skin cancer, The Guardian reports. This method of diagnosis reportedly matches the success rate of human medical professionals. However, while it could revolutionise this sector of healthcare, it also risks leaving certain groups and demographics behind.
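To make the approach concrete, here is a minimal sketch of the kind of image-classification training described above, fine-tuning an ImageNet-pretrained backbone on labelled skin images. The folder layout, dataset path and model choice are assumptions for illustration only, not the researchers’ actual pipeline.

```python
# Minimal sketch: fine-tune a pretrained image classifier on skin lesion photos.
# Assumes a hypothetical folder layout skin_images/train/<diagnosis>/<image>.jpg.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Standard preprocessing for ImageNet-pretrained backbones
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_data = datasets.ImageFolder("skin_images/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

# Replace the final layer so the network predicts the lesion classes in the data
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one training pass over the data
    optimiser.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimiser.step()
```

Whatever the exact architecture, a model trained this way can only learn from the images it is shown, which is exactly why the make-up of the training datasets matters so much.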

Researchers are arguing for significant strides to be made to ensure that all patients can benefit from these technological advancements. Like much of the medical world, the skin image databases that could be used as part of the AI’s ‘training’ to identify skin cancer contain very few, if any, examples of darker skin. This leaves out vital information needed to diagnose patients across different ethnic groups and skin types. In simpler terms, the AI’s effectiveness in diagnosis depends largely on the quality of the data it is trained on. The research and its findings were first presented at the National Cancer Research Institute (NCRI) Festival—a national cancer conference—and published in The Lancet Digital Health.

Doctor David Wen from the University of Oxford, the first author of the study, presented the findings and stated: “AI programs hold a lot of potential for diagnosing skin cancer because it can look at pictures and quickly and cost-effectively evaluate any worrying spots on the skin. However, it’s important to know about the images and patients used to develop programs, as these influence which groups of people the programs will be most effective for in real-life settings.” Wen went on to highlight how this could breed an exclusionary medical method in the field.

“You could have a situation where the regulatory authorities say that because this algorithm has only been trained on images in fair-skinned people, you’re only allowed to use it for fair-skinned individuals, and therefore that could lead to certain populations being excluded from algorithms that are approved for clinical use,” he explained. And even if broader use were allowed, the tool most likely wouldn’t be as effective: “Alternatively, if the regulators are a bit more relaxed and say: ‘OK, you can use it [on all patients]’, the algorithms may not perform as accurately on populations who don’t have that many images involved in training.”

This could bring a swathe of complications, such as failing to spot treatable cancers or to accurately assess the risks of surgery, the researchers explained. Their study, published in The Lancet Digital Health, details how the scientists used a “combined MEDLINE, Google and Google Dataset search” that found 21 open-access skin cancer databases containing over 100,000 images. The proportion of those images with ethnicity or skin type recorded is disheartening.

The researchers recorded that only a few of these 21 datasets noted the ethnic group or skin type of the individuals photographed. Among the 106,950 images, only 2,436 had a skin type detailed alongside the image. Of those 2,436, just ten recorded the individual as having brown skin, and only one was listed as having dark brown or black skin.
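Put as plain arithmetic, using only the counts reported in the study, the scale of the gap looks like this:

```python
# Counts reported by the study of 21 open-access skin cancer datasets
total_images = 106_950        # all images across the datasets
with_skin_type = 2_436        # images with a skin type recorded
brown_skin = 10               # of those, labelled brown skin
dark_brown_or_black = 1       # of those, labelled dark brown or black skin

print(f"Images with any skin type recorded: {with_skin_type / total_images:.1%}")        # ~2.3%
print(f"Brown skin among labelled images: {brown_skin / with_skin_type:.2%}")            # ~0.41%
print(f"Dark brown or black skin among labelled images: {dark_brown_or_black / with_skin_type:.2%}")  # ~0.04%
```

In other words, fewer than one image in forty carries any skin type label at all, and darker skin is all but absent from the labelled subset.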

“We found that for the majority of datasets, lots of important information about the images and patients in these datasets wasn’t reported. There was limited information on who, how and why the images were taken. […] This can potentially lead to the exclusion or even harm of these groups from AI technologies,” Wen added.

Such failings could quite literally be a matter of life or death for people of colour, Wen further explained: “Although skin cancer is rarer in people with darker skins, there is evidence that those who do develop it may have worse disease or be more likely to die of the disease. One factor contributing to this could be the result of skin cancer being diagnosed too late.” To combat this, the team of researchers hopes its data will fuel the creation of health data quality standards for the continued development of AI in medical care. Much like the drive Javid proposed, such standards would set specific requirements for which patient groups must be represented and which patient details must be recorded.
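As a rough illustration of what such a standard could require at the level of a single image, here is a hypothetical metadata record. The field names and the use of the Fitzpatrick skin type scale are assumptions for illustration, not the standard the researchers are drafting.

```python
# Hypothetical per-image metadata record of the kind a data quality standard
# might mandate; fields and values are illustrative assumptions only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SkinImageRecord:
    image_id: str
    diagnosis: str                         # e.g. "melanoma"
    diagnosis_method: str                  # how the diagnosis was confirmed
    fitzpatrick_skin_type: Optional[int]   # 1-6 scale; None if not recorded
    ethnicity: Optional[str]               # self-reported ethnic group
    capture_device: Optional[str]          # how the image was taken
    capture_reason: Optional[str]          # why the image was taken

record = SkinImageRecord(
    image_id="IMG_0001",
    diagnosis="melanoma",
    diagnosis_method="histopathology",
    fitzpatrick_skin_type=5,
    ethnicity="Black African",
    capture_device="dermatoscope",
    capture_reason="routine lesion check",
)
```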

Hopefully, this step becomes one of many that set out to equalise medical health information and better aid the lives of everyone.
