PRIVACY

Want To Hide From Face Recognition? Try An Anti-Surveillance T-Shirt

An artist is developing textile patterns that confuse algorithms by spamming them with false faces

HyperFace Prototype by Adam Harvey / ahprojects.com
Jan 04, 2017 at 4:46 PM ET

With the rise of predictive algorithms and face recognition technology, we’ve been slowly sleepwalking into a dystopian world where anyone can be identified, tracked and profiled, simply by walking past a CCTV camera on the street.

Privacy advocates have long anticipated this, warning of the unprecedented powers that unchecked face recognition will give to governments, police, criminals and corporations. But in the not-too-distant future, wearing a coat covered in algorithmically designed abstract patterns could be all you need to shroud yourself from a face recognition dragnet.


That’s the goal of HyperFace, a new project from Berlin-based artist Adam Harvey that aims to produce fabric patterns that confuse face detection technology. Developed in collaboration with Hyphen Labs for an exhibit at this year’s Sundance Film Festival, the patterns are what Harvey describes as “a new kind of camouflage,” designed to sabotage computer vision algorithms by feeding them false faces, making it much harder for them to detect which “face” is real.

Harvey is no newcomer to counter-surveillance fashion. In 2010, he started work on CV Dazzle, a series of makeup patterns designed to break up the symmetry of the wearer’s face, making it effectively invisible to face detection algorithms. The makeup works by disrupting the visual features such algorithms normally search for, lowering the “confidence score” they assign when deciding whether or not an image contains a face.
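To make that confidence score concrete, here is a minimal sketch using OpenCV’s stock Haar-cascade face detector, the kind of off-the-shelf tool this sort of camouflage targets. The input filename and parameter values are illustrative assumptions, not anything from Harvey’s project:

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade -- a common
# off-the-shelf detector of the type CV Dazzle aims to defeat.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# "portrait.jpg" is a placeholder input image (an assumption).
gray = cv2.cvtColor(cv2.imread("portrait.jpg"), cv2.COLOR_BGR2GRAY)

# detectMultiScale3 can return per-detection level weights, which act
# as a rough confidence score: higher means the cascade is more sure
# the region is a face. Camouflage works by dragging this score down.
boxes, reject_levels, level_weights = cascade.detectMultiScale3(
    gray, scaleFactor=1.1, minNeighbors=5, outputRejectLevels=True
)

for (x, y, w, h), weight in zip(boxes, level_weights):
    print(f"face candidate at ({x}, {y}), size {w}x{h}, "
          f"confidence ~{float(weight):.2f}")
```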

But that approach to anti-face recognition has technical limitations, especially as computer vision algorithms become increasingly sophisticated. And while the makeup patterns might obscure people’s faces from machines, they simultaneously make the wearer hyper-visible to other humans.

HyperFace takes the opposite approach: Rather than trying to reduce an algorithm’s ability to detect a face, Harvey’s new designs feature patterns that common computer vision algorithms will detect as multiple faces with a high degree of confidence, causing the wearer’s actual face to be lost in the crowd. The patterns are tuned to exploit “classifiers” — in machine learning, highly optimized templates built from large numbers of example images to detect a specific type of object or entity.
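For a rough sense of the intended effect, consider what the same stock detector reports when pointed at a decoy-covered textile. In this hypothetical sketch, “hyperface_pattern.jpg” is a placeholder filename, not one of Harvey’s released designs:

```python
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# Placeholder photo of a decoy-covered fabric (an assumption);
# Harvey had released only one prototype pattern at the time.
pattern = cv2.cvtColor(
    cv2.imread("hyperface_pattern.jpg"), cv2.COLOR_BGR2GRAY
)

# On an effective decoy textile, the detector returns many boxes,
# so the wearer's real face becomes one candidate among dozens.
boxes = cascade.detectMultiScale(pattern, scaleFactor=1.1, minNeighbors=5)
print(f"detector reports {len(boxes)} face-like regions")
```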


“Instead of seeking computer vision anonymity through minimizing the confidence score of a true face (i.e. CV Dazzle), HyperFace offers a higher confidence score for a nearby false face by exploiting a common algorithmic preference for the highest confidence facial region,” writes Harvey in his description of the project. “In other words, if a computer vision algorithm is expecting a face, give it what it wants.”
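The behavior Harvey describes exploiting boils down to a few lines. This toy sketch, with invented confidence numbers, shows how a pipeline that keeps only its single top-scoring candidate will latch onto a decoy that out-scores the real face:

```python
# Invented confidence numbers, purely for illustration.
detections = [
    {"label": "real face",     "confidence": 0.81},
    {"label": "decoy patch A", "confidence": 0.88},
    {"label": "decoy patch B", "confidence": 0.93},
]

# Many pipelines (auto-croppers, trackers) keep only the single
# highest-confidence facial region...
best = max(detections, key=lambda d: d["confidence"])

# ...so a decoy that out-scores the true face captures the
# algorithm's attention instead of the wearer.
print("pipeline attends to:", best["label"])  # -> decoy patch B
```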

Harvey’s project comes at a critical moment when commercial and government face recognition is on the verge of ubiquity. Some retail stores already use face recognition to detect customers’ emotional responses while shopping, and companies like Affectiva offer “emotion detection” services that can analyze the facial expressions of people who appear in archived videos. The dangers of widespread face recognition are especially apparent in the hands of law enforcement. A recent report from researchers at Georgetown University revealed that half of all American adults are currently in a law enforcement face recognition database. And some researchers have begun pushing scientifically disputed and ethically dubious algorithms that claim to predict whether a person will commit crimes based only on their facial features.

HyperFace isn’t a tested and proven solution yet, however. Harvey cautions that the patterns are only designed to combat specific image recognition algorithms, and the one pattern he has released is still a prototype. But he says he plans to share more example images near the end of January, once they’ve been more thoroughly tested.