Introducing PimEyes, the seedy search engine used by paedophiles to stalk their victims – Screen Shot



An investigation by The Intercept has put PimEyes at the centre of a troubling scandal that eases accessibility to child sex abuse material online.

What is PimEyes?

PimEyes is a search engine, but in a world of Googles, Bings and Yahoos, this particular AI invention occupies a dystopian niche of its own: a tool with the ability to find any face, however deep it sits in the murky corners of the internet. For $29.99 a month, PimEyes lets you run a search in mere seconds with just one photo.

The New York Times put the search engine to the test and, with the consent of its journalists, set about finding more photos of them on the web. Upload the image of the person you wish to find, accept the terms and conditions, and the engine draws up a grid of faces it has deemed similar to the one provided. For the publication, the results were instant and deeply concerning.

PimEyes managed to locate multiple photos of each journalist, surfacing image results that some of the individuals had never seen before. Worse still, it accomplished this accurately without even needing a clear initial image. The New York Times noted that even when sunglasses or masks hid the face, or when the subject was turned away from the camera, PimEyes could still use such a picture to find more photos of the person in question.

The search engine was created in 2017 by two Polish developers and later passed through various owners until it landed in the hands of international relations scholar Giorgi Gobronidze, who had met the pair behind it while lecturing in Poland.

PimEyes is Clearview AI, but for the public

Put simply, the search engine has been described by many as a 'public Clearview AI'. Clearview AI is a powerful facial recognition tool available only to law enforcement, and its most widely publicised recent use has been in the context of war, with Ukraine making headlines for deploying the technology against Russia.

That restricted access does not in itself make Clearview AI any less dangerous than PimEyes: the use of facial recognition by governments alarms those who foresee a Big Brother-like future. But where Clearview AI's data is available only to the relevant authorities, PimEyes offers that access to everybody, and with so little regulation it may only be a matter of time before disaster follows. In fact, as The New York Times reported, the search engine's new owner Gobronidze said there was a heavy reliance on (and expectation of) individuals simply acting "ethically" when using PimEyes.

It is these murky, unregulated waters that have created an environment where use of the search engine runs dangerously rampant.

Is PimEyes being used for good or for evil?

Following a thorough investigation by The Intercept, the use of PimEyes and its unregulated world has once again been called into question. The publication documented the search engine's use as a weapon of abuse, noting that its searches have proven able to "pull up 'potentially explicit' photos of kids".

Using fake images of children to conduct its searches, The Intercept found that PimEyes allowed anyone to search for images of minors from across the internet, a deeply concerning discovery, the publication noted. The fake images used in the searches actually pulled up real results of children from sources such as blogs and personal family websites. If it can do this with fake images of children, imagine what it could do with real ones.

What made matters even worse was that PimEyes would also provide the source link for each image found, in theory giving stalkers and predators an incredibly easy pathway to locating the children pictured. Not only could predators source more illicit imagery of the same child, they could also use the tool to stalk their victims. And the already murky waters of the tool only ran murkier following The Intercept's interview with its owner.

In what was described as a "vague and sometimes contradictory account" of PimEyes' privacy regulations, Gobronidze claimed the engine's primary use was not to search for others but to search one's own internet image footprint and clean it up where wanted. He suggested its subscriber base was made up primarily of women and girls hunting down revenge porn images of themselves. However, according to The Intercept, there are no controls in place to prevent users from searching for anyone besides themselves.

Does PimEyes search social media?

If you can believe it, the search engine was actually even worse before certain features were removed. According to reporting by Futurism, developers of the program decided to scale back PimEyes' search power after facing earlier backlash.

Before that warranted criticism, the search engine was even able to crawl social media sites for images. Given SCREENSHOT's own investigation into the world of paedophile rings preying on kidfluencer content on social media, it's safe to say The Intercept's findings could have been even broader had this still been the case. In fact, Gobronidze suggested to the publication that though PimEyes admittedly is "tailor-designed for stalkers," it had "subsequently cleaned up its act by no longer crawling social media," Futurism reported.