Facial Recognition Cameras Are Now Watching Your Emotions

Systems originally developed to identify people from photos can now detect gender, emotions, and much more

Photo Illustration: Diana Quach
May 30, 2017 at 2:47 PM ET

As facial recognition technology becomes more commonplace, some companies are seeking an edge on their competition by branching out — claiming to recognize a person’s emotional state, age, gender, and even criminal tendencies.

One such company is Russia’s NTechLab, which became infamous last year when it released FindFace, a creepy facial recognition app built on the social network VKontakte — a Russian analogue to Facebook — that lets users identify people on the street simply by taking their photo.

But like many in the facial recognition industry, NTechLab has been quickly expanding its technology to do far more than just ID random passersby. With recent upgrades to its technology platform, the company says it’s now offering police and corporations the power to automatically detect the emotions, gender, and age of anyone walking past a CCTV camera.

The move is an alarming expansion of the facial recognition tech demonstrated by the FindFace app, which was banned by Twitter and has been used by online harassers to identify female sex workers and adult performers. The company brags it can detect a person’s gender with 99 percent accuracy, and age with 95 percent accuracy to within a range of three years — though experts doubt those claims.

The platform’s real-time facial recognition feature, which NTechLab calls “friend or foe,” is designed for face verification in secure areas to detect unauthorized persons. A spokesperson for the company told Vocativ that its use could be expanded, though, by anyone looking to “detect potential criminals and fugitives by marking them as suspicious if they express emotions like fear, hatred, or nervousness.”


NTechLab is far from the only company deploying these controversial techniques, and many facial recognition systems have already gone beyond their original purpose of identifying people. Microsoft, Amazon, and Google each have image analysis platforms that claim to detect emotions, gender, age, attention, and other metrics with a high degree of accuracy. Boston-based Affectiva, an AI company that focuses on emotional analysis, offers emotion detection as a cloud service, allowing customers to upload massive archives of surveillance video and get back analytics tracking the emotional states of everyone appearing in them.

Lujo Bauer, a researcher who studies and develops computer vision systems at Carnegie Mellon University, says he doubts NTechLab’s claims of high accuracy facial analysis. But he notes that many of these capabilities have been available for quite some time — including in systems he and his team developed.

“Flagging people on a watchlist is a straightforward feature. There already exist surveillance cams (or software that accompanies them) that automatically identify the people who appear on camera,” Bauer told Vocativ. “Of course, this requires each person to be recognized to first be enrolled by showing the system a bunch of images of that person.”
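The enroll-then-match workflow Bauer describes can be sketched in a few lines. This is a hypothetical illustration, not NTechLab’s or Carnegie Mellon’s actual code: real systems derive face embeddings from a trained neural network, while here the “embeddings” are hand-made vectors so the example stays self-contained, and the `Watchlist` class, threshold, and names are all invented for the sketch.

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

class Watchlist:
    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.enrolled = {}  # name -> list of example embeddings

    def enroll(self, name, embeddings):
        # "Enrollment": the system first needs several example
        # images of each person it should be able to flag.
        self.enrolled.setdefault(name, []).extend(embeddings)

    def match(self, probe):
        # Compare a face seen on camera against every enrolled example
        # and report the best match above the threshold, if any.
        best_name, best_score = None, 0.0
        for name, examples in self.enrolled.items():
            for emb in examples:
                score = cosine_similarity(probe, emb)
                if score > best_score:
                    best_name, best_score = name, score
        return best_name if best_score >= self.threshold else None

watchlist = Watchlist(threshold=0.9)
watchlist.enroll("alice", [[1.0, 0.0, 0.1], [0.9, 0.1, 0.0]])
print(watchlist.match([0.95, 0.05, 0.05]))  # close to alice's examples
print(watchlist.match([0.0, 1.0, 0.0]))     # unenrolled face: no match
```

The key point of the sketch matches Bauer’s caveat: nothing can be flagged until it has first been enrolled with example images.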

Companies like the Israeli firm Faception have taken this even further, dubiously claiming to have built AI capable of making judgments about people’s likelihood of committing a crime based on the shape of their facial features. But those efforts have been widely discredited and condemned by ethicists and machine learning experts, who describe them as a computer-aided resurgence of racist and long-debunked pseudoscience.


NTechLab claims it already has more than 100 pilot programs running its facial analysis platform, including a biometrics firm that works with the Turkish government, a Russian casino, and a client that “works with [talent] agencies to find models and actors with particular features.” The company told Vocativ it has also provided the technology to law enforcement and government agencies outside Russia, but refused to name the specific clients, citing nondisclosure agreements.

No matter their originally stated purpose, privacy researcher and artist Adam Harvey says it’s now trivially easy for facial recognition systems to incorporate face analysis. Both technologies are now part of the same biometric technology pipeline, and facial recognition has become merely an “entry level” task, he says.

“Whether an advertising company is offering an emotional analysis product or a security contractor is offering a facial recognition service, there’s not much difference between the two systems,” he told Vocativ. “Once you have the facial analysis pipeline, adding more algorithms is trivial.”
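Harvey’s pipeline argument can be illustrated with a minimal sketch. Everything here is hypothetical — the `FacePipeline` class and the stand-in analyzers are invented for the example, and real analyzers would be trained models — but it shows the structural point: once the pipeline yields a detected face, each new capability (emotion, age, and so on) is just another function plugged into the same machinery.

```python
class FacePipeline:
    def __init__(self):
        self.analyzers = {}  # label -> function that takes a face crop

    def register(self, label, analyzer):
        # Adding a new capability is one line for the operator.
        self.analyzers[label] = analyzer

    def process(self, face_crop):
        # Every registered analyzer runs on the same detected face.
        return {label: fn(face_crop) for label, fn in self.analyzers.items()}

pipeline = FacePipeline()
pipeline.register("identity", lambda face: "unknown")  # stand-in matcher
pipeline.register("emotion", lambda face: "neutral")   # stand-in classifier
pipeline.register("age", lambda face: 30)              # stand-in estimator

report = pipeline.process("aligned-face-crop")
print(report)  # one face, three analyses from one pipeline
```

Whether the operator is an advertiser or a security contractor, the upstream detection work is identical; only the registered analyzers differ.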