
AI named after the 'V for Vendetta' mask protects photos from being gathered by facial recognition apps

Clearview AI is just one of many facial recognition firms scraping billions of online images to create a massive database for purchase – but a new program could block their efforts.

Researchers designed an image cloaking tool that makes subtle, pixel-level changes that distort pictures enough that they cannot be used by online scrapers – and its creators claim it is 100 percent effective.

Named in honor of the ‘V for Vendetta’ mask, Fawkes is an algorithm and software combination that ‘cloaks’ an image to trick systems, which is like adding an invisible mask to your face.

These altered pictures teach facial recognition models a distorted version of the subject's face, so when presented with an 'uncloaked' image, the scraping app fails to recognize the individual.

‘It might surprise some to learn that we started the Fawkes project a while before the New York Times article that profiled Clearview AI in February 2020,’ researchers from the SAND Lab at the University of Chicago shared in a statement.

‘It is our belief that Clearview AI is likely only the (rather large) tip of the iceberg. Fawkes is designed to significantly raise the costs of building and maintaining accurate models for large-scale facial recognition.’

‘If we can reduce the accuracy of these models to make them untrustable, or force the model’s owners to pay significant per-person costs to maintain accuracy, then we would have largely succeeded.’

The ‘V for Vendetta’ mask is a popular disguise among activists and the hacker collective Anonymous, and was the face of many protesters during the Occupy movement in New York in 2011.

And it seems fitting that the University of Chicago would name their program in the mask’s honor.

Fawkes takes a photo and makes tiny, pixel-level changes that are not visible to the naked eye – a process called ‘image cloaking.’

Users upload an image to the program and the system makes the alterations in just a few minutes.
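The core idea of image cloaking can be illustrated with a simple sketch. The snippet below is only a toy illustration of bounded pixel perturbation, not the actual Fawkes algorithm (which optimizes its perturbations against a facial feature extractor); the `cloak_image` function and the `epsilon` budget are illustrative assumptions.

```python
import numpy as np

def cloak_image(image, epsilon=8):
    """Toy sketch of image cloaking: add a small, bounded perturbation.

    Each pixel is shifted by at most `epsilon` (in 0-255 units), keeping
    the change largely imperceptible to the naked eye. The real Fawkes
    tool computes targeted perturbations rather than random noise.
    """
    rng = np.random.default_rng(0)
    # Random per-pixel perturbation in [-epsilon, epsilon].
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # Apply the perturbation and keep pixel values in the valid range.
    cloaked = np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)
    return cloaked

# Example: a dummy 4x4 grayscale "image" of mid-gray pixels.
original = np.full((4, 4), 128, dtype=np.uint8)
cloaked = cloak_image(original)
# No pixel moved by more than the epsilon budget.
assert np.max(np.abs(cloaked.astype(int) - original.astype(int))) <= 8
```

The point of the bound is that the cloaked photo still looks like the original to a human, while the accumulated small changes are enough to mislead a model trained on it.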

Fawkes co-creator Ben Zhao, a computer science professor at the University of Chicago, said: ‘What we are doing is using the cloaked photo in essence like a Trojan Horse, to corrupt unauthorized models to learn the wrong thing about what makes you look like you and not someone else.’

‘Once the corruption happens, you are continuously protected no matter where you go or are seen.’

Researchers say Fawkes has stumped generally benign facial recognition systems used by Facebook, Microsoft and Amazon, along with Clearview AI’s system.

The team is working on a user-friendly version of the program for both Mac and Windows.

Fawkes is currently available as source code, and you can compile it on your own computer.

Clearview AI came under fire earlier this year, when a report surfaced that more than 600 US police departments were using the facial recognition app that is capable of comparing uploaded photos with three billion images in its database culled from social media and other websites.

Clearview AI allows users to take a photo of a person and upload it to the app, which then matches it to publicly available photos of that person, displaying those images along with links to where they appeared online.

The publicly available photos are said to be held in a database that Clearview pulled together from outlets such as Facebook, Instagram and Twitter, but also Venmo, YouTube, employment and educational websites and supposedly millions of other online sites.

Clearview could be used to identify activists at rallies, but also random, attractive passersby, providing not just names and addresses, but also what they do and who they know, according to The New York Times.

The app’s website states that the technology is a ‘new research tool used by law enforcement agencies to identify perpetrators and victims of crimes’ and that it has helped those agencies capture hundreds of criminals, while also exonerating innocent people and helping to identify the victims of crimes.
