PimEyes’ decision to make facial-recognition software available to the general public breaches a line that technology companies are typically unwilling to cross, and opens up endless possibilities for how it can be used and abused.
Imagine a potential employer digging into your past, an abusive ex tracking you, or a random stranger snapping a photo of you in public and then finding you online. This is all possible through PimEyes: Though the website instructs users to search for themselves, it doesn’t stop them from uploading photos of anyone. At the same time, it doesn’t explicitly identify anyone by name, but as CNN Business discovered by using the site, that information may be just clicks away from images PimEyes pulls up.
“Using the latest technologies, artificial intelligence and machine learning, we help you find your pictures on the Internet and defend yourself from scammers, identity thieves, or people who use your image illegally,” the website declares.
It’s precisely this ease of access that concerns Clare Garvie, a senior associate at Georgetown Law’s Center on Privacy and Technology, who has extensively researched police use of facial-recognition technology.
“Face recognition at its foundation is a tool of identification,” Garvie told CNN Business. “Think of any reason a person would want to conduct an identification — positive and negative — and that’s what this tool makes possible.”
“A creepy stalking tool”
The images come from a range of websites, including company, media and pornography sites — the last of which PimEyes told CNN Business that it includes so people can search online for any revenge porn in which they may unknowingly appear.
PimEyes’ ease of access and the lack of enforcement of its own search rules make it a tool primed for online stalking and surveillance, said Lucie Audibert, legal officer with London-based human rights group Privacy International.
“In the hands of random citizens, like you or me, it becomes a creepy stalking tool where you can identify anyone on the streets or in any public space,” Audibert said.
To get a sense for what PimEyes can do and how well it works, CNN Business paid for the $29.99-per-month individual subscription, which gave me the ability to conduct 25 “premium” searches per day, see all the search results PimEyes dredged up from around the internet, and set up alerts for any new images that PimEyes comes across.
I conducted multiple searches for my face online, using new and old photos featuring different hairstyles. In some I wore glasses; in others I did not. Sometimes, before PimEyes would conduct a search, a pop-up forced me to check two boxes saying I accepted the site’s terms of service and that I agreed to use a photo of my face to conduct the search.
The results that were actually pictures of me (and not, say, pornographic images of similar-looking women, of which there were plenty) were mostly familiar. These included work-related headshots, still images from videos I recorded while testing gadgets years ago, and a picture of me smiling with my high school journalism teacher.
In that last photo, my eyes appear closed and I’m wearing black glasses. It’s a blurry image, but it’s definitely me.
With PimEyes, I could trace a selfie to my identity with just a few clicks. As a journalist with headshots and biographies at multiple publications’ websites, it’s pretty easy to connect my face to my name online. So I tried again with the image of a friend (after first getting his consent) who works in another field and has a smaller online presence; one of the first results was from his website, which has his name in the URL.
Shrouded in secrecy
I wanted to learn more about how PimEyes works, and why it’s open to anyone, as well as who’s behind it. This was much trickier than uploading my own face to the website. The website currently lists no information about who owns or runs the search engine, or how to reach them, and users must submit a form to get answers to questions or help with accounts.
Poring over archived images of the website via the Internet Archive’s Wayback Machine, as well as other online sources, yielded some details about the company’s past and how it has changed over time.
The shift toward encouraging self-searches makes sense to Garvie, who pointed out that, initially, Clearview AI was more widely available than it is now (she knows someone outside of law enforcement, she said, who had the app on his phone).
They refused to conduct a formal interview, saying they “don’t take part in live interviews or direct interviews,” but that they would answer questions sent via email. Over multiple messages they answered a number of questions, but ignored or sidestepped others, such as why the company had shifted its focus from encouraging users to search for anyone to suggesting they search only for themselves.
They would not say how much they paid to purchase PimEyes from its prior owners, nor why they bought it, though they did write the company is currently based in the Seychelles due to the country’s “good incorporation environment.”
When asked where employees are actually based, they answered that PimEyes has an “international team, but we don’t want to disclose details.”
They confirmed that the facial-recognition search engine works similarly to other such systems, by comparing measurements between different facial features in one image (the one you upload) to those in others (in this case, ones it has found online). In order to match up the faces that users submit, PimEyes must scour the internet for images of people. PimEyes doesn’t save images from around the internet, they explained, but it does keep an index of facial-feature measurements from photos it has spotted on the web.
This kind of AI-driven image-matching is different from what happens when you upload a picture of yourself to a site such as Google Images and conduct a search: There, the results will include pictures of similar people (for me, that means lots of dark-haired women in glasses), but Google isn’t using facial measurements in the hopes of finding you, specifically, in other pictures online.
When PimEyes’ search engine finds a match between the photo a user uploads and one PimEyes has previously seen online, it can pair the measurements of the previously analyzed photo with the web address where that photo is located. The website shows you an array of all the pictures it thinks look most like your own photo.
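The matching process the company describes — an index of facial-feature measurements paired with the web addresses where the photos were found, searched by similarity to an uploaded image — can be sketched roughly in code. This is purely illustrative: the feature vectors, URLs, and similarity threshold below are invented, and PimEyes has not disclosed how its system actually computes or compares measurements.

```python
import math

def cosine_similarity(a, b):
    # Compare two facial-feature vectors; 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical index: measurements extracted from photos seen on the web,
# stored alongside the address of each photo (not the photo itself).
index = [
    ([0.91, 0.12, 0.44, 0.30], "https://example.com/team-page.jpg"),
    ([0.15, 0.88, 0.52, 0.07], "https://example.com/conference.jpg"),
]

def search(query_vector, threshold=0.95):
    # Return the addresses of indexed photos whose measurements
    # closely match the uploaded photo's measurements.
    return [url for vector, url in index
            if cosine_similarity(query_vector, vector) >= threshold]
```

A search with measurements matching the first indexed photo would return only that photo’s address, which mirrors the behavior described above: the engine pairs a match against its stored measurements with the location where the photo lives.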
The search accuracy, the company claimed, is about 90%; in general, the accuracy of facial-recognition technology depends on many factors, such as the quality of face images that are fed into a system.
They would not name any paying business customers, saying only that “there are no law enforcement agencies among them.”
“It is naive to think that if our search engine didn’t exist, harassers wouldn’t break the law,” they wrote. “On the other hand — we are available to everyone, so any victim of harassment or other internet crime can check themselves using our search engine.”
Connecting names and faces
This accessibility is precisely what concerns Audibert, of Privacy International, and Garvie, of Georgetown. One of Audibert’s biggest concerns about PimEyes, she said, perhaps even more than with Clearview, is whose hands it could fall into. People could use it to identify others in public places, she pointed out, while private companies could use it to track people.
Garvie, who used PimEyes on an image of her own face, noticed that most of the results that were not her were of similar-looking White women in their 30s. This type of misidentification is common across facial-recognition algorithms, she said, and also makes it more likely that a person who sees those results will then make a misidentification.
PimEyes’ technology could hurt people in other ways, too, such as by outing people who are transgender — intentionally or not. When Rachel Thorn, a professor at Kyoto Seika University, uploaded a recent photo of herself to PimEyes, she encountered other recent images of herself. There were also older images, she said, where she presented as masculine. She looks very different today, she said, but guessed that PimEyes may have picked up on similarities between facial features in a recent photo and old photos.
“As a transgender person it was not a great feeling to see old photos of myself show up. I’m pretty sure almost any transgender person would feel the same way,” she said.
Thorn, who studies Japanese graphic novels, known as manga, was impressed by the technology but also worried about how it could be abused. And since the site didn’t stop her from uploading anyone else’s image, she did: She looked up an acquaintance who had worked in pornography by uploading a selfie that person sent her. Sure enough, pornographic images of her friend popped up.
“I thought, ‘Oh my gosh,’” she said. “If you wanted to find out if someone had ever done work in porn, this would do it.”