A Kremlin-Linked Mogul Made A Selfie App That Reads Your Emotions

Magic wants to target users with emotion-based advertising — and comes from one of the Kremlin's favorite tech entrepreneurs

Photo Illustration: R. A. Di Ieso
Apr 10, 2017 at 4:41 PM ET

An irritating, dystopian future where advertisers profile and target you by reading your facial expressions is coming. And once it’s here, you can blame trendy face-tracking photo booth apps.

Much like Snapchat’s Lenses and a conveniently similar Facebook feature, a new app called Magic uses face detection to overlay users’ faces with silly, interactive digital masks. The app reacts to changes in the user’s facial expression, triggering animated tears and hearts when you do things like open your mouth, smile, pout, or make a kissy face.
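The expression-triggered behavior described above can be sketched in a few lines. This is a hypothetical illustration, not Magic's actual code: it assumes a face tracker that emits per-frame expression scores between 0.0 and 1.0, and the threshold, score names, and animation names are all invented for the example.

```python
# Hypothetical sketch of expression-triggered overlay logic.
# Assumes a face tracker that reports per-frame expression scores (0.0-1.0);
# the score keys and animation names here are illustrative, not Magic's.

def pick_overlay(expressions):
    """Map detected expression scores to an overlay animation name."""
    if expressions.get("mouth_open", 0.0) > 0.6:
        return "animated_tears"
    if expressions.get("smile", 0.0) > 0.6:
        return "hearts"
    if expressions.get("kiss", 0.0) > 0.6:
        return "kiss_burst"
    return None  # no expression strong enough to trigger an animation

# Example frame from the hypothetical tracker:
frame = {"smile": 0.82, "mouth_open": 0.1}
print(pick_overlay(frame))  # -> hearts
```

The same per-frame expression scores that drive a playful animation are, of course, exactly the signal an advertiser would want to log.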

Inspired by the runaway success of augmented reality apps like Pokémon Go, Magic’s creators claim it’s “the first app that detects facial expressions in real-time and augments your videos with emotion based animations.” But they’re also pitching something else that makes consumers and privacy advocates nervous: emotion-based advertising.

For now, Magic Unicorn, the parent company behind Magic, says it’s looking to make money through brand-sponsored content. But the app’s creators also plan to woo advertisers with what co-founder Ashot Gabrelyanov calls “emotions based targeting.”

“In other words, clients can choose to advertise to users who are happy, sad, et cetera,” Gabrelyanov told Vocativ. “For example, we noticed many of our users expressed sadness on Valentine’s day. Dating services could leverage this information and better target their marketing spend.”
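In practice, the "emotions based targeting" Gabrelyanov describes would amount to filtering an ad audience by each user's detected mood. A minimal sketch, assuming a simple list of user records with a hypothetical `mood` field (nothing here reflects Magic's real data model):

```python
# Hypothetical illustration of mood-based ad targeting.
# The user records and "mood" field are invented for this example.
users = [
    {"id": 1, "mood": "sad"},
    {"id": 2, "mood": "happy"},
    {"id": 3, "mood": "sad"},
]

def target_by_mood(users, mood):
    """Select the ids of users whose latest detected mood matches the campaign."""
    return [u["id"] for u in users if u["mood"] == mood]

# A dating service targeting users who appeared sad on Valentine's Day:
print(target_by_mood(users, "sad"))  # -> [1, 3]
```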

Magic is far from the first company to try to cash in on the emotion tracking trend. Marketers have called the ability to capture and analyze customers’ emotional states “the future of retail.” High-end fashion chains like Saks Fifth Avenue have added face recognition capabilities to their stores’ security cameras, hoping to catch shoplifters, track VIP customers, and see how shoppers react to different products. And emotion recognition startups like Affectiva even offer emotion analytics as a service, allowing companies to annotate their massive archives of surveillance video with their customers’ emotional states.

The recent rise of face detecting photo booth apps opens another possible avenue for emotion capture. While it’s no secret that these apps use real-time face and object detection to apply their quirky video filters, the feature could make it easy for companies like Facebook to record users’ emotions, potentially opening the floodgates to a new era of emotion-based advertising and surveillance.

A Facebook representative told Vocativ that the company doesn’t gather any data about users when they activate selfie masks, saying that all the algorithms are run locally on the user’s device. The spokesperson further added that “Facebook doesn’t use any face recognition information for ad targeting at this time.” And Snapchat’s parent company, Snap, says that it doesn’t retain information about a user’s face when they activate Lenses. Snap also doesn’t infer emotional states based on that information, a spokesperson told Vocativ.

Magic’s privacy policy doesn’t list what types of data the app collects. But a brief look at its co-founder’s past ventures doesn’t exactly inspire confidence.

Now living in Brooklyn, Gabrelyanov is the son of Russian media mogul Aram Gabrelyanov, whose media conglomerate, News Media, is funded by two billionaire friends of President Vladimir Putin. The junior Gabrelyanov is also the founder and former CEO of LifeNews, now simply called Life, a pro-Kremlin outlet that pushed conspiracy theories about Putin’s political rival, Boris Nemtsov, in the months leading up to his assassination, among other falsehoods.


While Gabrelyanov makes no apologies for his past, he hopes the shift to emotion recognition tech will separate him from what he calls the “political baggage” of Russian media. He claims that Magic has received no support from LifeNews or any of its financial backers, telling Vocativ via email that the app has so far been entirely “bootstrapped.”

Gabrelyanov made various promises of built-in privacy protection, saying that Magic is “sensitive not to capture controversial or intimate data,” and that the app only collects “analytics data,” including usage information like how many times an emotion is expressed in a given session. He also claimed that the app “can’t infer user identity from the confluence of usage data that we do capture.”
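The kind of aggregate "analytics data" Gabrelyanov describes, per-session counts of how often each emotion was expressed, with no identifying information attached, could look something like the sketch below. This is an assumption about what such a pipeline might do, not a description of Magic's implementation.

```python
# Hypothetical sketch of anonymous per-session emotion analytics:
# reduce a session's raw emotion events to aggregate counts, discarding
# any identifying detail. Event names are invented for the example.
from collections import Counter

def summarize_session(emotion_events):
    """Collapse a session's emotion events into anonymous per-emotion counts."""
    return dict(Counter(emotion_events))

events = ["smile", "smile", "sad", "smile"]
print(summarize_session(events))  # -> {'smile': 3, 'sad': 1}
```

Whether counts like these truly cannot be tied back to an identity, as Gabrelyanov claims, depends entirely on what else is logged alongside them.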

But privacy advocates worry that could easily change, and not everyone might realize they’re agreeing to have their emotions captured and sold simply by downloading an app.

“There’s been an explosion of technology that can capture our biometrics, find out who we are, and figure out things about us,” Adam Schwartz, a staff attorney for the Electronic Frontier Foundation, told Vocativ. “If I want somebody to be able to use my emotions, and I’ve consented to it, great. On the other hand, if I haven’t given consent, I don’t want a store taking my photo and figuring out who I am.”

Right now, only two U.S. states, Illinois and Texas, have laws restricting the commercial use of face recognition and other biometric technologies. Illinois’s law, the Biometric Information Privacy Act, or BIPA, is the stronger of the two, and is at the center of an ongoing lawsuit against Facebook, Google, Shutterfly, and other companies that gather biometric data.

BIPA requires companies to acquire customers’ informed written consent before collecting, sharing, or using biometric information. That runs contrary to Facebook’s practice of mining user photos for the facial templates it uses for its “tag suggestions” feature.

Under BIPA, Schwartz says, companies would need to obtain distinct consent to perform emotion analysis. But it’s unclear if other states will follow Illinois’s example, and whether photo booth apps like Magic are affected will depend on how they handle face detection and analysis.

“If an application observes a face and then sticks horns on the head, that’s not facial recognition,” Schwartz said. “What crosses the line is when they take the particular faces and apply software to extract information about that person.”

Update April 11, 2017: This article was updated to include statements from Facebook.