New AI skin cancer diagnosis system proved to be less effective on dark skin

By Monica Athnasious

Published Nov 11, 2021 at 09:30 AM

Reading time: 3 minutes

Every day, we learn more about how AI is being used in new and innovative ways to combat serious issues. One department that can’t seem to get enough of the technology is the medical world. From being used to diagnose early signs of dementia and helping paralysed people write with their minds to something as ridiculously funny as smart toilets protecting your health, AI is set to be the future of healthcare. Perfect, right? Well, not quite.

Much like most infrastructures and systems, AI is no stranger to the systemic racism that permeates almost all institutions of our current society. While there are examples of its use to actually combat racism—take Sajid Javid’s drive to utilise AI in tackling racial inequality throughout the NHS—the technology is not without its faults. Its use in recent years has shown a pattern of discrimination: from Facebook’s (now Meta) AI labelling black men as ‘primates’ to systems failing to distinguish different people of colour from each other, the tech has a long way to go. Its projected saturation in society has even led to real fears that AI could soon ‘hack’ our biometric data.

Now, another supposed breakthrough in AI medicine comes with an obvious catch: new research details how AI systems being trained to diagnose skin cancer are far less effective on darker skin.

The new AI recognition technology works by training machine learning algorithms to identify and diagnose specific types of skin cancer, The Guardian reports. This method of diagnosis reportedly matches the success rate of human medical professionals. However, while it could revolutionise this sector of healthcare, it also risks leaving certain groups and demographics behind.
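To make the idea a little more concrete, here is a minimal, hypothetical sketch of the kind of training pipeline such systems typically rely on: fine-tuning a pretrained image network on labelled skin photographs. The folder layout, class labels and settings below are illustrative assumptions, not details taken from the study.

```python
# Hypothetical sketch: fine-tuning a pretrained image classifier on labelled
# skin lesion photographs. Paths, classes and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard ImageNet-style preprocessing applied to each photograph.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumed folder layout: skin_images/train/<diagnosis>/<image>.jpg
train_data = datasets.ImageFolder("skin_images/train", transform=preprocess)
train_loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a network pretrained on everyday images and replace its final
# layer with one output per diagnosis (melanoma, benign naevus, and so on).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a handful of passes over the training images
    for images, labels in train_loader:
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimiser.step()
```

The crucial point for what follows is the first step of that pipeline: whatever images sit in the training folder determine which patients the finished tool will work well for.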

Researchers are arguing for significant strides to ensure that all patients can benefit from these technological advancements. Like much of the medical world, the skin image databases that could be used as part of the AI’s ‘training’ to identify skin cancer contain very few, if any, examples of darker skin. This leaves out information vital for diagnosing patients of different ethnic groups and skin types. In simpler terms, the AI’s effectiveness depends largely on the quality of the data it is trained on. The research and its findings were first presented at the National Cancer Research Institute (NCRI) Festival—a national cancer conference—and published in The Lancet Digital Health.
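What would checking that training data actually involve? As a rough illustration, the sketch below tallies how often each skin type appears in a dataset’s metadata file—the kind of audit the researchers argue should be routine. The file name and the ‘fitzpatrick_skin_type’ column are hypothetical stand-ins; the 21 datasets in the study do not necessarily follow this schema.

```python
# Hypothetical audit: count how many images in a dataset's metadata file
# record each skin type, and how many leave the field blank entirely.
from collections import Counter
import csv

counts = Counter()
with open("dataset_metadata.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Many real datasets simply leave this information out.
        counts[row.get("fitzpatrick_skin_type") or "not recorded"] += 1

total = sum(counts.values())
for skin_type, n in counts.most_common():
    print(f"{skin_type}: {n} images ({n / total:.1%})")
```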

Doctor David Wen of the University of Oxford, first author of the study, presented the findings: “AI programs hold a lot of potential for diagnosing skin cancer because it can look at pictures and quickly and cost-effectively evaluate any worrying spots on the skin. However, it’s important to know about the images and patients used to develop programs, as these influence which groups of people the programs will be most effective for in real-life settings.” Wen went on to highlight how this could breed an exclusionary approach in the field.

“You could have a situation where the regulatory authorities say that because this algorithm has only been trained on images in fair-skinned people, you’re only allowed to use it for fair-skinned individuals, and therefore that could lead to certain populations being excluded from algorithms that are approved for clinical use,” he explained. And even if its use were permitted, it most likely wouldn’t be as effective: “Alternatively, if the regulators are a bit more relaxed and say: ‘OK, you can use it [on all patients]’, the algorithms may not perform as accurately on populations who don’t have that many images involved in training.”

This could bring a swathe of complications, such as failing to spot treatable cancers or to accurately assess the risks of surgery, the researchers explained. Their study, published in The Lancet Digital Health, details how the scientists used a “combined MEDLINE, Google and Google Dataset search” that found 21 open-access skin cancer databases containing over 100,000 images. The proportion of those images that record ethnicity or skin type is disheartening.

The researchers noted that only a few of these 21 datasets recorded the ethnic group or skin type of the individuals photographed. Among the 106,950 images, only 2,436 had a skin type recorded alongside them. Of these 2,436, just ten recorded the individual as having brown skin, and only one was listed as having dark brown or black skin.
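Put in proportion, the imbalance is stark. The short calculation below uses only the figures reported by the researchers.

```python
# The study's headline figures, as reported in The Lancet Digital Health.
total_images = 106_950
with_skin_type = 2_436   # images where any skin type was recorded
brown_skin = 10          # of those, labelled as brown skin
dark_brown_or_black = 1  # of those, labelled as dark brown or black skin

print(f"Skin type recorded: {with_skin_type / total_images:.2%} of all images")
print(f"Brown skin: {brown_skin / total_images:.4%} of all images")
print(f"Dark brown or black skin: {dark_brown_or_black / total_images:.4%} of all images")
```

That works out to a skin type being recorded for roughly 2 per cent of the images, with brown skin appearing in less than a hundredth of one per cent of them and dark brown or black skin in less than a thousandth of one per cent.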

“We found that for the majority of datasets, lots of important information about the images and patients in these datasets wasn’t reported. There was limited information on who, how and why the images were taken. […] This can potentially lead to the exclusion or even harm of these groups from AI technologies,” Wen added.

Such failings could quite literally be a matter of life or death for people of colour, Wen further explained: “Although skin cancer is rarer in people with darker skins, there is evidence that those who do develop it may have worse disease or be more likely to die of the disease. One factor contributing to this could be the result of skin cancer being diagnosed too late.” To combat this, the team of researchers hopes to use its findings to inform health data quality standards for the continued development of AI in medical care. Much like the drive Javid proposed, the standards would include specific requirements for the representation of different patient groups and specify which patient details must be recorded.

Hopefully, this step becomes one of many that set out to equalise medical health information and better serve everyone.
