How fashion brand UNLABELED’s wearable technology prevents AI from detecting humans

By Monica Athnasious

Published Oct 30, 2021 at 09:30 AM

Reading time: 3 minutes

The integration of AI into today’s society is developing so rapidly that it is, in part, becoming our norm. Bringing swift and dramatic changes for a supposed good, AI is being used to tackle racial inequality, diagnose medical conditions like dementia, predict suicide attempts, combat homelessness in the UK and even offer you legal advice—that’s right, an AI lawyer. However, while newer technologies have their undeniably progressive qualities, AI also reminds us of the terrifying prospect of George Orwell 1984-level monitoring, as well as its clearly evident failings.

Such examples include its potential use in the US to spy on prison inmates, Facebook’s AI racially mislabelling black men as ‘primates’, the economic exploitation of vocal profiling, its continued inability to distinguish between different people of colour and its worryingly uncontrollable future—to name but a few. This small list of incidents is enough to raise the question: should we fear AI? Well, a new fashion brand on the rise seems to think so.

UNLABELED (stylised in all caps) is an incredible and exciting new artist group and textile brand that is shaping fashion’s relationship with AI. Founded in Japan—in collaboration with Dentsu Lab Tokyo—creators Makoto Amano, Hanako Hirata, Ryosuke Nakajima and Yuka Sai have developed what is described as a “camouflage against the machines.” Its specially designed garments are constructed with specific patterns that prevent AI systems deployed in the real world from recognising you. Don’t worry, we’ll break down how this all works, but first, let’s look at why the team decided to create the brand.

The details of the creators’ project—found on the Computational Creativity Lab—divulge the reasoning behind the creation. “Surveillance capitalism is here,” it states. “Surveillance cameras are now installed outside the homes as well as in public places to monitor our activities constantly. The personal devices are recording all personal activities on the internet as data without our knowledge.” In the brand’s accompanying project documentation video found on the same page, it further details how this acquired data on us may be used: “The system transforms our everyday behaviour into data and abuses it for efficiency and pursuit of profit.”

For the creators, “the physical body is not an exception” when it comes to AI exploitation of our data. “With the development of biometric data and image recognition technology for identifying individuals, the information in real space is instantly converted into data,” they write. “Thus, our privacy is threatened all the time. In [this] situation, what does the physical body or choice of clothes mean?” From this question, UNLABELED’s fashion camouflage to evade information exploitation was born.

In a video example of the garments in action on the Computational Creativity Lab website, a comparison is shown between two individuals—one wearing the garment and one in normal clothes. “[When] wearing [UNLABELED’s] particular garments, AI will hardly recognise the wearer as ‘human’, [while] people wearing normal clothes are easily detected,” the video narrates. And it appears to work. The camera is no longer able to recognise the person wearing UNLABELED. The brand’s name is definitely fitting, but how does it work?


If you guessed AI, you’d be right. UNLABELED is fighting AI with AI. The brand’s team have developed a series of patterns that work to confuse surveillance AI. The patterns were created by another AI model: “In order to fool AI, we adversarially trained another AI model to generate specific patterns inducing AI to misrecognize, then created a camouflage garment using the pattern,” the creators explain in the project documentation video.


This technique is described by UNLABELED as an “Adversarial Patch” or “Adversarial Examples.” The method involves adding specific patterns—or small perturbations invisible to the naked eye—to images or videos with the intention of inducing false recognition in the AI. When successful, the resulting patterns cause the AI to misrecognise shapes and objects. UNLABELED notes that the technique is currently widely used in research aimed at fixing exactly these shortcomings in surveillance systems, but the brand has flipped the weakness to its products’ benefit in order to “protect our privacy.”
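To make the idea a little more concrete, here is a minimal, hypothetical sketch of how an adversarial patch can be optimised against an off-the-shelf person detector. This is not UNLABELED’s actual pipeline: the choice of detector (torchvision’s Faster R-CNN trained on COCO), the fixed patch placement and the simple confidence-suppression loss are all assumptions made purely for illustration.

```python
# A minimal, illustrative sketch of the adversarial-patch idea: optimise a
# printable pattern so that an off-the-shelf person detector's confidence in
# the "person" class drops when the pattern is pasted onto a photo.
# NOTE: this is NOT UNLABELED's pipeline -- the detector, patch placement
# and loss below are assumptions chosen to keep the example short.
import torch
import torchvision

# Pretrained COCO detector; in COCO, label 1 corresponds to "person".
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()
for p in model.parameters():
    p.requires_grad_(False)

patch = torch.rand(3, 100, 100, requires_grad=True)  # the trainable pattern
optimiser = torch.optim.Adam([patch], lr=0.01)

def paste(img, patch, y=200, x=150):
    """Overlay the patch at a fixed torso-ish position (real work randomises this)."""
    out = img.clone()
    out[:, y:y + patch.shape[1], x:x + patch.shape[2]] = patch
    return out

# Placeholder training photos of people: 3xHxW float tensors in [0, 1].
# In practice these would be loaded with torchvision.io.read_image(...) / 255.0.
photos = [torch.rand(3, 480, 640)]

for step in range(100):
    loss = torch.zeros(())
    for img in photos:
        detections = model([paste(img, patch)])[0]
        person_scores = detections["scores"][detections["labels"] == 1]
        loss = loss + person_scores.sum()      # push "person" confidence down
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    with torch.no_grad():
        patch.clamp_(0, 1)                     # keep the pattern printable
```

Research patches of this kind (Thys et al.’s work against the YOLOv2 person detector is a well-known example) also add printability and robustness terms so the pattern survives printing, fabric folds and changing camera angles, which is presumably the hard part of turning a digital attack into a garment that still works on the street.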


“Once the adversarial pattern is made, we lay them out onto the 2D-pattern. Then, the pattern [is] printed onto plain polyester blend fabric with transcription. After printing, we follow the general garment production procedure,” UNLABELED states. The brand has even developed a skateboard in the same AI-evading patterns. The products are available for purchase on the brand’s website.

While such fashion technology is not available on a wide scale, it does indicate a continuing pushback against heavy monitoring—I mean, we all hate it when those adverts pop up a few minutes after we’ve merely mentioned the name of a specific product. I know you’re listening to me, Apple. However, there are two sides to every coin, even when it comes to AI. While this technology could better aid us in avoiding surveillance, it could better aid anyone in avoiding surveillance, if you know what I mean. From my own perspective however, it’s another sign of a generation rebelling against the norm—and I fucking love it.
