Facial Recognition Is Everywhere — But So Are Tools To Defeat It

The new Privacy Visor is one of many devices designed to thwart facial detection systems

Photo Illustration: R. A. Di Ieso
May 03, 2017 at 10:48 AM ET

Apart from celebrities and high-profile fugitives, most people take for granted the ability to walk around in public without being identified by strangers, blissfully detached from their names and personal history. In recent years, however, that basic expectation of public anonymity has been rapidly eroding.

Thanks to the rise of social media, ubiquitous cameras, internet-connected devices, and massive police facial recognition databases, more than half the U.S. adult population can now be near-instantly identified and tracked on the street simply by revealing their face. In response, privacy-minded engineers and activists have been fighting back with tech of their own.

“You can see a crumbling border between the cyber and physical worlds due to the spread of advanced sensors,” said Professor Isao Echizen during a recent talk at the International Workshop on Obfuscation at NYU Law School.

Before becoming a researcher at Japan’s National Institute of Informatics, Echizen spent a decade working in a research and development lab at Hitachi, where he designed copyright protection systems for the Japanese electronics giant. More recently, he’s been using his experience defending corporate intellectual property to defend people’s privacy.

In 2012, Echizen and his colleagues unveiled a prototype of the Privacy Visor, a bizarre-looking pair of glasses that defeats face detection systems by blasting camera sensors with beams of near-infrared light, which are invisible to the human eye.

The visor worked, but it wasn’t exactly subtle or flattering to wear. Commercial face detection algorithms have also evolved since then, and computer vision researchers have used advanced machine learning systems like artificial neural networks to successfully thwart various anti-face detection techniques — including Echizen’s prototype.

Now, after years of development, Echizen has unveiled an improved version of the Privacy Visor that doesn’t require power or use any electronics at all. Instead, the new model — which he officially released to market in March — uses repeating white patterns printed on a plastic transparency. The dense patterns reflect light back at the camera’s sensor, causing enough noise to prevent many algorithms from successfully detecting faces.

When I got my hands on one of Echizen’s Privacy Visors, I had almost no difficulty fooling the face detection schemes used by popular social media platforms. The puppy faces and other cute video filters provided by Snapchat’s face-detecting Lenses quickly disappeared once I lowered the visor onto the center of my face — though it often needed a bit of adjustment. Facebook’s algorithm also failed to detect any faces in uploaded photos of people wearing the visor from various angles and distances.

But as situations change and technology improves, experts say there’s no permanent “silver bullet” solution to the problem Echizen’s Privacy Visor is trying to solve. In a paper published last September, researchers at Cornell University built an artificial neural network that can de-obfuscate and match face images with up to 95 percent accuracy, even at extremely low resolutions.

“There’s no approach that ‘just works,’ or anything close to it,” Lujo Bauer, a researcher at Carnegie Mellon University who recently co-authored a paper detailing new methods of defeating facial recognition, told Vocativ.

That’s partly due to the old paradox of obfuscation: if you’re the only one actively trying to hide from surveillance technologies like facial recognition, you’re way more likely to stand out.

“In general, the more effective the approach, the more likely it is to be conspicuous to those nearby,” said Bauer. “A face recognition algorithm may not immediately identify you, but bystanders may gawk, and a human looking at a video feed would also likely notice that something suspicious is going on.”

Nevertheless, researchers have continued to come up with new methods and technologies for thwarting face detection and recognition. In a recent paper, Bauer and his collaborators designed a set of 3D-printed glasses with patterns that reduce the effectiveness of facial recognition — the algorithms that try to identify people in a photo, after their faces have been successfully detected.

Artist and privacy researcher Adam Harvey is perhaps best known for his CV Dazzle anti-face detection makeup patterns. Instead of preventing algorithms from detecting faces, his most recent experiment causes them to detect too many.

The project, called HyperFace, is a textile pattern containing vaguely face-shaped assortments of pixels that form “ideal” face template images, as understood by the most commonly used face detection method, Viola-Jones. This turns the material into a kind of camouflage, hiding the wearer’s face by fooling the algorithm into detecting false ones.
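Viola-Jones works by sliding a window across an image and evaluating thousands of simple rectangular light-versus-dark contrasts, known as Haar-like features, computed in constant time via a summed-area table called an integral image. That reliance on coarse brightness contrasts is what a printed pattern can exploit. Here is a minimal Python sketch of that core primitive — illustrative only; the real detector chains many such features into a boosted cascade, and this is not HyperFace’s actual generation code:

```python
def integral_image(img):
    """Build a summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum of pixels in the inclusive rectangle (x0,y0)-(x1,y1),
    in constant time regardless of rectangle size."""
    total = ii[y1][x1]
    if x0 > 0:
        total -= ii[y1][x0 - 1]
    if y0 > 0:
        total -= ii[y0 - 1][x1]
    if x0 > 0 and y0 > 0:
        total += ii[y0 - 1][x0 - 1]
    return total

def two_rect_haar_feature(ii, x, y, w, h):
    """A vertical two-rectangle Haar-like feature (h assumed even):
    compares the top half of the window against the bottom half."""
    half = h // 2
    top = rect_sum(ii, x, y, x + w - 1, y + half - 1)
    bottom = rect_sum(ii, x, y + half, x + w - 1, y + 2 * half - 1)
    return bottom - top
```

On a real face, a feature like this fires because the eye band is darker than the cheeks below it. A fabric that reproduces such contrasts at many positions gives the detector the same signal everywhere, which is what lets HyperFace flood it with false positives.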

Still, Harvey says it’s hard to convince large numbers of people to wear the 21st-century equivalent of a tin foil hat. With future versions of HyperFace, he’s aiming to make the “faces” on the material less visible using colored patterns — part of a continuing effort to “make tin foil look really good,” he says.

“There’s a lot of propaganda coming from the FBI and Facebook that you have no right to hide,” said Harvey during a talk at the NYU Obfuscation workshop. “Going against that narrative has always been difficult because Facebook is popular and the FBI is powerful.”

The effectiveness of any anti-facial recognition tool ultimately depends on what kind of adversary you’re trying to protect against, and how badly they want to find you. While there’s no practical way to hide from a well-resourced government agency, Bauer says it’s difficult — but still not impossible — to hide from automated systems that identify and track people en masse.

“Suppose I don’t want to be identified in a crowd at a protest. If I’m at the protest for four hours, it means that if any [camera] captured me in the four hours, I have to fool all of them,” Bauer told Vocativ. That can be extremely difficult, he said, since remaining undetected means evading face detection from every possible camera angle and at every distance — not to mention making sure your anti-face detection visor or glasses don’t slip off, even for a moment.
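Bauer’s point can be made concrete with some back-of-the-envelope arithmetic. If each sighting by a camera is an independent chance to be detected, the odds of evading all of them decay exponentially. The numbers below are illustrative assumptions, not figures from Bauer or the research he describes:

```python
def chance_undetected(per_sighting_success_rate, total_sightings):
    """Probability of evading detection in every one of the sightings,
    assuming each sighting is an independent trial."""
    return per_sighting_success_rate ** total_sightings

# A visor that defeats face detection 99% of the time per sighting,
# against 10 cameras that each capture you 50 times over four hours:
odds = chance_undetected(0.99, 10 * 50)
```

Under these assumed numbers, even a 99-percent-effective device leaves well under a one percent chance of going entirely unnoticed — which is why evading mass tracking is difficult even when any single camera is easy to fool.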

Nevertheless, Bauer thinks there’s good reason for people to want anti-facial recognition tech. And he has no doubt creative solutions like the Privacy Visor will keep popping up as the cat-and-mouse game continues.

“For very specific situations, I believe there will continue to be solutions, but those are likely to be of more benefit to people seeking to cause harm than to those seeking to maintain their privacy,” he said. “To maintain privacy in general, however, we’d have to be evading facial recognition all the time, regardless of who’s doing it and what algorithm they’re using — and that’s much harder.”
