Social Media

Mess With Facebook’s Emotion Tracking By Randomizing Your Feelings

'Go Rando' will confuse the company — and your friends — by mixing up your emojis

Illustration: Diana Quach
Apr 13, 2017 at 2:04 PM ET

For as long as many of its users can remember, the Facebook “Like” was the platform’s sole method of pithy self-expression — an ambiguous, often awkward digital grunt that could mean any number of things depending on the context.

Facebook finally changed that in 2016 when it expanded the humble Like into a wider range of emoji-based reactions: Wow, Angry, Love, Haha, and Sad. But in doing so, the social media giant also expanded its ability to track and profile users based on their emotions and behavior.

That’s what led artist Ben Grosser to create Go Rando, a browser extension that lets you mess with Facebook’s emotion-tracking by automatically randomizing all of your emoji-based post reactions.

“While reactions may help your friends better understand how you feel about a particular post, it also begins to build, over time, a more detailed profile of users’ emotional life on Facebook,” said Grosser during a recent panel at the International Obfuscation Workshop at NYU Law School.

Go Rando disrupts this profile-building by intercepting a user's clicks and automatically injecting random emotions. A Haha becomes Angry, Love becomes Sad, and so on, making the user's true feelings indecipherable, not to mention extremely confusing to other users.
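As a rough illustration of the approach, a browser-extension content script might intercept reaction clicks and substitute a random choice along the following lines. This is a minimal sketch only: the "data-reaction" attributes and selectors are hypothetical stand-ins, since Facebook's actual markup differs and changes frequently.

```typescript
// content-script.ts: a minimal sketch in the spirit of Go Rando.
// NOTE: "data-reaction" is a hypothetical attribute marking reaction
// buttons; Facebook's real markup differs and changes often.

const REACTIONS = ["Like", "Love", "Haha", "Wow", "Sad", "Angry"];

// Pick uniformly at random so no single emotion dominates the profile.
const randomReaction = (): string =>
  REACTIONS[Math.floor(Math.random() * REACTIONS.length)];

document.addEventListener(
  "click",
  (event) => {
    // Ignore the synthetic click we dispatch below, to avoid a loop.
    if (!event.isTrusted) return;

    const target = (event.target as HTMLElement | null)?.closest<HTMLElement>(
      "[data-reaction]"
    );
    if (!target) return;

    // Swallow the user's real choice...
    event.preventDefault();
    event.stopPropagation();

    // ...and click a randomly chosen reaction button instead.
    const menu = target.parentElement;
    const substitute = menu?.querySelector<HTMLElement>(
      `[data-reaction="${randomReaction()}"]`
    );
    substitute?.click();
  },
  true // capture phase: run before the page's own handlers
);
```

Catching the click in the capture phase lets the script act before the page's own handlers run, and checking event.isTrusted keeps it from intercepting the synthetic click it dispatches itself.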

Grosser points out that there are plenty of good reasons users would want to hide their genuine emotions from Facebook. He specifically notes the company's infamous 2014 "emotional contagion" experiment, in which it secretly manipulated users' news feeds by injecting positive or negative posts to see if it affected the type of content they posted.

Moreover, he says, the way you behave on Facebook does affect what kind of content you see, and not just ads. The end result of this curation is so-called “filter bubbles,” the algorithmic echo chambers where the types of content and viewpoints you see are limited based on your preferences and past behavior.
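To make that curation loop concrete, here is a deliberately toy model of engagement-based ranking, invented for illustration and in no way Facebook's actual algorithm: posts on topics you have reacted to before float to the top, so the feed narrows toward your past behavior.

```typescript
// A toy model of engagement-based feed ranking (invented for
// illustration; not Facebook's actual algorithm).

interface Post {
  id: number;
  topic: string;
}

// Counts of past reactions per topic stand in for a behavioral profile.
type Profile = Map<string, number>;

// Sort posts by how often the user has engaged with their topic before.
function rankFeed(posts: Post[], profile: Profile): Post[] {
  return [...posts].sort(
    (a, b) => (profile.get(b.topic) ?? 0) - (profile.get(a.topic) ?? 0)
  );
}

// A user who has mostly reacted to one viewpoint...
const profile: Profile = new Map([
  ["candidate-A", 40],
  ["candidate-B", 2],
]);

const feed: Post[] = [
  { id: 1, topic: "candidate-B" },
  { id: 2, topic: "candidate-A" },
  { id: 3, topic: "candidate-A" },
];

// ...sees that viewpoint first; dissenting posts sink to the bottom.
console.log(rankFeed(feed, profile).map((p) => p.topic));
// => ["candidate-A", "candidate-A", "candidate-B"]
```

Every reaction feeds the profile, and the profile feeds the ranking; that feedback loop is exactly what Go Rando's random noise is meant to degrade.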

For proof, one need look no further than the 2016 election, when these filter bubbles, combined with the platform's penchant for spreading viral hoax news articles, left many voters shocked by Hillary Clinton's loss to Donald Trump. A British firm called Cambridge Analytica took advantage of this to great effect, bolstering both Trump and the UK's Brexit campaign by using sentiment analysis and Facebook quizzes to target promoted content to users based on their personality types.

“The Facebook algorithm is not designed to show us the world as it is, it’s designed to show us the world as it thinks we want it to look like,” said Grosser. “The effect of this is that we’re collectively trusting the mechanisms of democracy to algorithms designed to keep us engaged, rather than informed. Facebook’s motivation is profit, not democracy.”

Obviously, there's a downside to randomized emotional reactions. Unless your Facebook friends understand what you're doing, letting Go Rando respond to a post about your friend's dead grandpa with "Haha" will probably raise eyebrows.

Facebook also has plenty of other behavioral metrics it can use to infer your true feelings, such as what you comment on, what you share, and how long you linger over a post. And as several audience members pointed out during Grosser's panel, it's probably easier to simply do nothing and avoid using Facebook's reactions altogether.

But Grosser said that his goal was less about creating an effective tool and more about provoking conversation about what your emotional life looks like to Facebook's algorithm, as well as the consequences of letting those responses be tracked over time.

“Not using Facebook entirely would be even more effective,” Grosser responded during the panel. “But 1.7 billion people on Facebook aren’t doing nothing, so I’m focusing on them.”