How to get around facial recognition

From colourful make-up to invisibility cloaks, attempts to fool facial recognition software are widespread, but can these really help to keep our biometric data private?

A decade ago, it was possible to attend a protest in relative anonymity. Unless a person was on a police database or famous enough to be identified in a photograph doing something dramatic, there would be little to link them to the event. That’s no longer the case.

Thanks to a proliferation of street cameras and rapid advances in facial recognition technology, private companies and the police have amassed face data or faceprints of millions of people worldwide. According to Big Brother Watch, a UK-based civil liberties campaign group, this facial biometric data is as sensitive as a fingerprint and has been largely harvested without public consent or knowledge.

Face off: The quest to mess with facial recognition technology

In response, designers and privacy activists have sought to make clothing and accessories that can thwart facial recognition technology. According to Garfield Benjamin, a post-doctoral researcher at Solent University who specialises in online privacy, these designs rely on two main techniques.

“Either they disrupt the shape of the face so that recognition software can’t recognise a face is there or can’t identify the specific face,” he says. “Or they confuse the algorithm with different patterns that make it seem like there are either hundreds or no faces present.”
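The first of those techniques can be illustrated with a toy sketch. Everything here is invented for illustration, not taken from any real detector: real systems use deep networks rather than a hand-made template, but the principle is similar in that the software scores how well a region matches the statistics it expects of a face, and disrupting the expected contours pushes that score down.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 8x8 "face template": bright skin, dark eye and mouth regions.
template = np.ones((8, 8))
template[2, 2] = template[2, 5] = 0.0   # eyes
template[5, 2:6] = 0.2                  # mouth

def face_score(image, template):
    """Normalised correlation between an image patch and the template.

    A stand-in for a detector's confidence: close to 1 means
    "this looks like the face I expect", lower means it does not.
    """
    a = image - image.mean()
    b = template - template.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

# A "real" face: the template plus a little sensor noise.
clean_face = template + rng.normal(0, 0.05, template.shape)

# "Dazzle": high-contrast paint over one eye and cheek disrupts the
# contours, symmetry and dimensions the detector expects.
dazzled = clean_face.copy()
dazzled[0:4, 0:4] = rng.choice([0.0, 1.0], size=(4, 4))

print(face_score(clean_face, template))  # close to 1: the detector fires
print(face_score(dazzled, template))     # lower: may fall under the threshold
```

The sketch only shows the shape of the idea; it also hints at why the arms race described below is possible, since a detector retrained on painted faces would learn new statistics and the same paint would stop working.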

At the University of Maryland, Tom Goldstein, associate professor in the Department of Computer Science, is working on the second technique. He’s created a so-called invisibility cloak, though in reality it looks more like an incredibly garish hoodie. The cloak, a research tool which is also sold online, works by fooling facial recognition software into thinking there isn’t a face above it.

In 2015, when Scott Urban, founder of Chicago-based privacy eyewear brand Reflectacles, saw facial recognition becoming “more popular and intrusive”, he set out to make glasses that would “allow the wearer to opt out of these systems”.

He created a model designed to block the 3D infrared facial scanning used by many security cameras by blacking out the lenses, while another model reflects light to make it harder to extract a user’s face data from a phone picture.

Other anti-surveillance designs include a wearable face projector, which superimposes another face over that of the person wearing the device; a transparent mask with a series of curves that attempts to block facial recognition software while still showing the wearer’s expressions; balaclavas with a magnified pixel design; and scarves covered in a mash-up of faces.

The IRpair glasses by Reflectacles are designed to block 3D infrared facial scanning

The anti-spoofers strike back

Benjamin says the problem with all these techniques is that the companies making the facial recognition technology are always trying to improve their systems and overcome the tricks, often boasting in their promotional literature about the anti-spoofing mechanisms they are working on. “They want to show they’re thwarting the ‘rebels’ or ‘hackers’ and this has led to further developments in the technologies,” he says.

This was the case with CV Dazzle, which uses face paint to trick or dazzle the computer vision by disrupting the expected contours, symmetry and dimensions of a face. The technique was invented by the American artist and activist Adam Harvey in the early 2010s and proved effective at confusing the software emerging at the time, though its creator has noted it doesn’t always fool present-day tech.

Yet it does still disrupt facial tagging on some social media platforms, according to Georgina Rowlands of The Dazzle Club, a UK-based privacy activist group inspired by Harvey. “We know the technique is still effective versus Facebook, Snapchat and Instagram’s algorithms,” says Rowlands, whose group leads monthly walks around London, adorned in rather striking Bowie-esque face paint, to explore privacy and public space in the 21st century. “But we haven’t been able to access more advanced systems such as the Metropolitan Police’s, so we can’t say if it’s effective there.”

Awareness around facial recognition issues

But evading the tech is only part of the story for The Dazzle Club. It’s as much about raising awareness of the pervasiveness of facial recognition software. As another member of the group, Emily Roderick, says: “It’s about making that invisible technology visible and bringing out those discussions, especially as the Met Police are starting to deploy these cameras in the city.”

The real goal for many of these creators is regulation of facial recognition technology companies and those who use the faceprints, to protect the privacy rights of the individual. So whether someone is at a protest or simply walking down the street, they can trust that their face, and all the data contained within it, remains their own and theirs alone.