New AI skin cancer diagnosis system proved to be less effective on dark skin

By Monica Athnasious

Published Nov 11, 2021 at 09:30 AM

Reading time: 3 minutes

Every day, we learn more about how AI is being used in new and innovative ways to combat serious issues. One field that can’t seem to get enough of the technology is the medical world. From being used to diagnose early signs of dementia and helping paralysed people write with their minds to something as ridiculously funny as smart toilets protecting your health, AI is set to be the future of healthcare. Perfect, right? Well, not quite.

Much like most infrastructures and systems, AI is no stranger to the systemic racism that permeates almost all institutions of our current society. While there are examples of its use to actually combat racism—take Sajid Javid’s drive to utilise AI in tackling racial inequality throughout the NHS—the system is not without its faults. In fact, its use in recent years has shown a historic pattern of discrimination: from Facebook’s (now Meta) AI labelling black men as ‘primates’ to its failure to distinguish different people of colour from each other, the tech has a long way to go. Its projected saturation in society has even led to real fears that AI could soon ‘hack’ our biometric data.

Now, another supposed breakthrough in AI medicine comes with its obvious catch; new research has surfaced detailing how AI systems being trained to diagnose skin cancer are far less effective on darker skin.

The new AI recognition technology works by training systems with machine learning algorithms that can identify and diagnose specific types of skin cancer, The Guardian reports. This method of diagnosis reportedly matches the success rate of human medical professionals. However, while it could revolutionise this sector of healthcare, it also risks leaving certain groups and demographics behind.

Researchers are arguing for significant strides to be made to assure all patients that they can benefit from these technological advancements. Like much of the medical world, the skin image databases used as part of the AI’s ‘training’ to identify skin cancer contain very few, if any, examples of darker skin. This leaves out vital diagnostic information pertaining to different ethnic groups and skin types. In simpler terms, the AI’s effectiveness in diagnosis depends largely on the quality of the data it is trained with. This important research, and its findings, were first presented at the National Cancer Research Institute (NCRI) Festival—a national cancer conference—and published in The Lancet Digital Health.

Doctor David Wen from the University of Oxford, the first author of the study, presented the findings and stated, “AI programs hold a lot of potential for diagnosing skin cancer because it can look at pictures and quickly and cost-effectively evaluate any worrying spots on the skin. However, it’s important to know about the images and patients used to develop programs, as these influence which groups of people the programs will be most effective for in real-life settings.” Wen went on to highlight how this could breed an exclusionary diagnostic method in the field.

“You could have a situation where the regulatory authorities say that because this algorithm has only been trained on images in fair-skinned people, you’re only allowed to use it for fair-skinned individuals, and therefore that could lead to certain populations being excluded from algorithms that are approved for clinical use,” he explained. And even if its use were permitted on all patients, it most likely wouldn’t be effective: “Alternatively, if the regulators are a bit more relaxed and say: ‘OK, you can use it [on all patients]’, the algorithms may not perform as accurately on populations who don’t have that many images involved in training.”

This could bring a swathe of complications, such as failing to spot treatable cancers and inaccurately assessing the risks of surgery, the researchers explained. Their comprehensive study, published in The Lancet Digital Health, details how the scientists utilised a “combined MEDLINE, Google and Google Dataset search” that found 21 open-access skin cancer databases containing over 100,000 images. The proportion of those images representing different ethnic groups is disheartening.

The researchers recorded that only a few of these 21 datasets noted the ethnic group or skin type of the individuals photographed. Among the 106,950 images, only 2,436 had a skin type recorded alongside the image. Of those 2,436, just ten recorded the individual as having brown skin, with only one listed as having dark brown or black skin.

“We found that for the majority of datasets, lots of important information about the images and patients in these datasets wasn’t reported. There was limited information on who, how and why the images were taken. […] This can potentially lead to the exclusion or even harm of these groups from AI technologies,” Wen added.

Such failings could quite literally be a matter of life or death for people of colour, Wen further explained, “Although skin cancer is rarer in people with darker skins, there is evidence that those who do develop it may have worse disease or be more likely to die of the disease. One factor contributing to this could be the result of skin cancer being diagnosed too late.” To combat this, the team of researchers hope to use their data to fuel the creation of health data quality standards for the continued development of AI in medical care. Much like the drive Javid proposed, the standards would include specific requirements for the representation of certain patient groups and specify which details about individuals must be recorded.

Hopefully, this step becomes one of many that sets out to equalise medical health information and better aid the lives of everyone.
