New Study Confirms What We’ve Always Known: Facebook Manipulates You
In 2012, researchers at Facebook, Cornell and the University of California at San Francisco took control of some 700,000 users’ news feeds and manipulated the content they saw. Why? They wanted to see if digital content could alter people’s moods.
Not surprisingly, they found that yes, scrolling through Facebook can make you feel good or bad. If you see positive content, you feel positive. And vice versa.
Since the report was published late last week, critics have lashed out, calling it unethical and creepy. They say it was wrong of Facebook to place people into a “manipulation” study without their consent. And they argue studies like these are just the beginning of a slippery slope that could lead to something much darker.
Perhaps all of that is true—it sure is a little unnerving to know that Silicon Valley has a direct pipeline to your emotional state.
But it also doesn’t surprise me in the least. And I think the backlash is silly.
There’s something about this study that has people really on edge. Over at Slate, Katy Waldman theorizes, “Facebook intentionally made thousands upon thousands of people sad.” That’s very possible, but it’s also worth pointing out that this study was conducted over just eight days in January 2012. It wasn’t some sprawling diabolical initiative that lasted for several years and resulted in mass suicide.
“At the end of the day,” explained Adam Kramer, the Facebook data scientist behind the report, “the actual impact on people in the experiment was the minimal amount to statistically detect it—the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.”
Read that last part again. “People produced an average of one fewer emotional word, per thousand words, over the following week.” In all likelihood, the people being studied had no idea their feeds were being manipulated, and the experiment had very little effect on their lives once it concluded.
Of course, that’s of no solace to the folks who contend that Facebook should have asked the users before enrolling them in the study. “Facebook is our platform,” people seem to be saying. “How dare you manipulate what’s ours without consent?”
To that, I’d respond with this: You’d be foolish to think that Facebook belongs to you, or that the social network, as a publicly traded company, owes anything to you. On Facebook, you are not the user. You are the product. You might not have to pay for the service with money, but you pay for it with your consent for Facebook to use your data as it pleases.
After all, that’s the company’s business: data. Just like Google and Twitter, Facebook is constantly using your (self-produced) data to optimize engagement on the site and—ding, ding!—sell more ads. From Facebook’s terms of service: “You permit a business or other entity to pay us to display your name and/or profile picture with your content or information, without any compensation to you.”
So it’s all manipulation, really, and it always has been. It’s well-known (and a source of litigation) that Facebook scans your private messages to target ads for specific products at you. Now that Facebook knows it can control your mood, perhaps—as the conspiracy goes—the network will begin to load up your feed with sad content, and then shoot you a bunch of advertisements for “happy” products, like vacations or spa treatments. Who knows?
But before you get worked up about a mood study from 2012—before you delete your account or post an angry rant on Facebook—remember that Facebook is manipulating you every day, using the content you choose to upload.
And you probably don’t even realize it.