We all know that Facebook is this huge, creepy entity that has dubious clauses laced into its user agreement. In creating an account and checking the “I Accept These Terms” box, users have effectively given up their rights to their own privacy. But like I said, this is not news to us. We go on Facebook to get our daily fix of last night’s party pics and gratuitous selfies and TMI status updates regardless of the fact that Mark Zuckerberg is most likely trying to run some shady shit in the background. But maybe we should reconsider?
News has broken that Facebook turned you into an unwitting guinea pig for their first (latest?) mood experiment. According to the New York Times, “Facebook revealed that it had manipulated the news feeds of over half a million randomly selected users to change the number of positive and negative posts they saw. It was part of a psychological study to examine how emotions can be spread on social media.” Users who participated — and I use that word so very loosely, since no one actually knew they were participating — were shown an altered newsfeed of only emotionally positive or negative posts.
So this means, for example, that if your newsfeed was selected to show only “negative posts,” you’d be presented solely with photos, links and statuses from friends that Facebook’s algorithms deemed negative in nature. The photos of happy babies, girls’ weekends and engagement rings were nowhere in sight for those users. The theory being tested was that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” That’s from the official study report, in case you want to roll up your sleeves and read it for fun.
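If you want a rough sense of what that kind of filtering looks like, here’s a toy sketch in Python. Everything in it — the word lists, the sample posts, the filter_feed function — is invented for illustration; it is not Facebook’s actual system, which (per the study) classified posts using word-counting software and a much bigger ranking machine behind the scenes.

```python
# Toy illustration only -- not Facebook's real algorithm.
# Deprioritize (drop) posts that contain words from one emotional category,
# loosely mirroring the "emotional word in the post" check the study describes.

POSITIVE_WORDS = {"happy", "love", "great", "excited", "engaged"}
NEGATIVE_WORDS = {"sad", "angry", "awful", "lonely", "miserable"}

def is_emotional(post_text, word_set):
    """Return True if the post contains any word from the given set."""
    words = {w.strip(".,!?").lower() for w in post_text.split()}
    return bool(words & word_set)

def filter_feed(posts, suppress="negative"):
    """Return the feed with posts carrying the suppressed emotion removed."""
    word_set = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
    return [p for p in posts if not is_emotional(p, word_set)]

feed = [
    "So happy about the engagement ring!",
    "Feeling lonely and miserable tonight.",
    "Lunch was fine, I guess.",
]

print(filter_feed(feed, suppress="negative"))
# Only the happy post and the neutral post survive --
# the "negative" one quietly never makes it into your feed.
```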
In case you were wondering, users did in fact uphold the theory by posting content in the vein of what they were being shown.
And if any of this sounds sketchy to you, don’t worry. The Times said, “Although academic protocols generally call for getting people’s consent before psychological research is conducted on them, Facebook didn’t ask for explicit permission from those it selected for the experiment. It argued that its 1.28 billion monthly users gave blanket consent to the company’s research as a condition of using the service.”
So while participants were not made aware that they were participating and were never even sent notification after the study had closed, it’s all perfectly legal, because every current Facebook user pre-agreed to participate in any study conducted by Facebook. Unsurprisingly, users are not happy about this one bit.
Because of the backlash, the leader of the Facebook study, Adam Kramer, went so far as to post an apology for the study on his personal Facebook wall:
OK so. A lot of people have asked me about my and Jamie and Jeff‘s recent study published in PNAS, and I wanted to give a brief public explanation. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.
Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.
And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.
The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.
While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.
As if you needed another reminder to be careful what you post online and to be mindful of what you agree to, this is it. Always read the fine print. And for the love of god, your news feed should not dictate your happiness. You gotta do you.