In 2022, every new dawn seems to bring a controversial AI tool to the internet. Gone are the days when netizens leveraged DALL·E 2 and the racist image-spewing DALL·E mini to generate silly art for Twitter shitposting. Instead, recent months have witnessed these tools being deployed to win legit art competitions, replace human photographers in the publishing industry, and even steal original fan art in a growing series of ethical, copyright, and dystopian nightmares.
Just when we thought the innovations on the AI generator front may have come to a relative standstill, a new tool is now roasting people beyond recovery solely based on their selfies.
Dubbed CLIP Interrogator and created by AI generative artist @pharmapsychotic, the tool essentially helps figure out "what a good prompt might be to create new images like an existing one." For instance, take the case of the AI thief who ripped off a Genshin Impact fan artist by taking a screenshot of their work-in-progress livestream on Twitch, feeding it into an online image generator to "complete" it, and uploading the AI version of the art to Twitter six hours before the original artist posted theirs. The swindler then had the audacity to accuse the artist of theft and proceeded to demand credit for the creation.
With CLIP Interrogator, the thief could essentially upload the ripped screenshot and get a series of text prompts that would help accurately generate similar art using other text-to-image generators like DALL·E mini. The process is a bit cumbersome, but it opens up a whole new realm of possibilities for AI-powered tools.
On Twitter, however, people are using CLIP Interrogator to upload their own selfies and get verbally destroyed by a bot. The tool called one user a "beta weak male," described a second as "extremely gendered with haunted eyes," and went on to dub a third "Joe Biden as a transgender woman." It also seemed to reference porn websites specifically when fed images of women in tank tops. Are we surprised? Not in the least. Disappointed? As usual.
Since I don't exactly trust an AI with my own selfies (totally not because I can't handle the blatant roasting or anything), I decided to test the tool by uploading some viral images of public figures. On my list were resident vampire boi Machine Gun Kelly (MGK), his best bud Pete Davidson, and of course selfie aficionado Kim Kardashian.
After several refreshes and dragging minutes of "Error: This application is too busy. Keep trying!" I finally got CLIP Interrogator to generate text prompts based on one of MGK's infamous mirror selfies. "Non-binary, angst white, Reddit, Discord," the tool spat.
Meanwhile, the American rapper's bud Davidson got "Yung lean, criminal mugshot, weirdcore, pitbull, and cursed image," to name a few. For reference, the picture in question was the shirtless selfie the Saturday Night Live star took to hit back at Kanye West while dating Kim Kardashian. Speaking of the fashion mogul, Kardashian's viral diamond necklace selfie was described by the AI tool as "inspired by Brenda Chamberlain, wearing a kurta, normal distributions, wig."
As noted by Futurism, CLIP Interrogator is "built on OpenAI's Contrastive Language-Image Pre-Training (CLIP) neural network that was released in 2021, and hosted by Hugging Face, which has dedicated some extra resources to deal with the crush of traffic." As the tool remains over-trafficked, further details are hazy at this point.
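For the curious: CLIP itself scores how well a piece of text matches an image, and prompt-inversion tools in this vein broadly work by ranking candidate phrases by that image-text similarity. Below is a minimal Python sketch of that ranking step, using toy three-dimensional vectors in place of CLIP's real high-dimensional embeddings. The phrase list and numbers are purely illustrative, not CLIP Interrogator's actual vocabulary or method.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rank_prompts(image_emb, candidates):
    """Rank candidate phrases by similarity to an image embedding.

    candidates maps phrase -> text embedding; real CLIP would produce
    these embeddings with its image and text encoders.
    """
    scored = sorted(candidates.items(),
                    key=lambda kv: cosine(image_emb, kv[1]),
                    reverse=True)
    return [phrase for phrase, _ in scored]

# Toy embeddings standing in for CLIP encoder outputs.
image_emb = [0.9, 0.1, 0.2]
candidates = {
    "criminal mugshot": [0.8, 0.2, 0.1],
    "oil painting": [0.1, 0.9, 0.3],
    "cursed image": [0.7, 0.0, 0.5],
}

print(rank_prompts(image_emb, candidates))
# The best-matching phrases come first; a tool can then stitch
# the top candidates together into a reusable text prompt.
```

In this toy setup, "criminal mugshot" ranks first because its vector points in nearly the same direction as the image embedding, which is the intuition behind why the tool's output reads like a comma-separated pile of descriptors.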
All we know for sure is that the roast bot has a long way to go when it comes to biases, especially when used by netizens to comment on their own selfies. And given that Twitter has recorded 320 tweets under the search term "CLIP Interrogator" as of today, it seems the tool is here to stay for a while.