To Facebook, We Are All Lab Rats
Facebook routinely adjusts its users’ news feeds — testing out the number of ads they see or the size of photos that appear — often without their knowledge. It is all for the purpose, the company says, of creating a more alluring and useful product.
But last week, Facebook revealed that it had manipulated the news feeds of over half a million randomly selected users to change the number of positive and negative posts they saw. It was part of a psychological study to examine how emotions can be spread on social media.
The company says users consent to this kind of manipulation when they agree to its terms of service. But in the quick judgment of the Internet, that argument was not universally accepted.
“I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it’s possible,” the privacy activist Lauren Weinstein wrote in a Twitter post.
On Sunday afternoon, Adam D. I. Kramer, the Facebook data scientist who led the study, posted a public apology on his Facebook page.
“I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused,” he wrote.
Facebook is hardly the only Internet company that manipulates and analyzes consumer data. Google and Yahoo also watch how users interact with search results or news articles to adjust what is shown; they say this improves the user experience. But Facebook’s most recent test did not appear to have such a beneficial purpose.
“Facebook didn’t do anything illegal, but they didn’t do right by their customers,” said Brian Blau, a technology analyst with Gartner, a research firm. “Doing psychological testing on people crosses the line.”
In an academic paper published in conjunction with two university researchers, the company reported that, for one week in January 2012, it had altered the number of positive and negative posts in the news feeds of 689,003 randomly selected users to see what effect the changes had on the tone of the posts the recipients then wrote.
The researchers found that moods were contagious. The people who saw more positive posts responded by writing more positive posts. Similarly, seeing more negative content prompted the viewers to be more negative in their own posts.
Although academic protocols generally call for getting people’s consent before psychological research is conducted on them, Facebook didn’t ask for explicit permission from those it selected for the experiment. It argued that its 1.28 billion monthly users gave blanket consent to the company’s research as a condition of using the service.
But the social network’s manipulation of its users’ feelings without their knowledge stirred up its own negative reaction. Some Facebook users and critics suggested that the company had crossed an ethical boundary.
Read the research: “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks.”