Facepalm: Facebook experiments on its users, Part I


As Popular Mechanics tells it, there is a motto written on the wall at Facebook: “Move fast and break things.”

It sounds like the usual motivational blather, so no one realized that they meant every word, especially “fast” and “break.” Until now.
In January 2012, a Facebook data scientist, working with experimenters from Cornell University and the University of California, San Francisco, manipulated the feeds of 689,003 English-language users (about 1 in 2,500 users) over one week. The firm changed what users saw in order to study whether it could shift the “mood” of such a large group by manipulating which posts from friends these users would see. Behind their quest is social science lore about “emotional contagion.” The experimenters then published a study of the findings in the Proceedings of the National Academy of Sciences (PNAS). Here's the study (open access—that is, free to download).
Then it hit the fan.
The background, as the authors explain, is that because “people’s friends frequently produce much more content than one person can view,” Facebook’s practice is to filter the content of users’ News Feed “via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging.”
So, as bioethicist Michelle N. Meyer recounts, the filter was manipulated as follows:

A post was identified as “positive” or “negative” if it used at least one word identified as positive or negative by software (run automatically without researchers accessing users’ text).
To do so, the researchers conducted two experiments, with a total of four groups of users (about 155,000 each). In the first experiment, Facebook reduced the positive content of News Feeds. Each positive post “had between a 10-percent and 90-percent chance (based on their User ID) of being omitted from their News Feed for that specific viewing.” In the second experiment, Facebook reduced the negative content of News Feeds in the same manner. In both experiments, these treatment conditions were compared with control conditions in which a similar portion of posts were randomly filtered out (i.e., without regard to emotional content).

Note that whatever negativity users were exposed to came from their own friends, not, somehow, from Facebook engineers. In the first, presumably most objectionable, experiment, the researchers chose to filter out varying amounts (10 percent to 90 percent) of friends’ positive content, thereby leaving a News Feed more concentrated with posts in which a user’s friend had written at least one negative word.
If all this sounds confusing, the bottom line is that Facebook was filtering News Feed posts to mess with users’ perceptions of reality, for an experiment in which the users did not know they were the subjects, one that would later be written up in a top science journal.
Consent? Well, legally, if you consented to their terms of use, you consented to one little word, “research.” But did you know what that meant? University of Western Australia innovation researcher David Glance notes,
Although the experiment may not have breached any of Facebook’s user agreements, it is clear that informed consent was not obtained from the participants of the research.
Facebook’s Data Use Policy only says that it has the right to use information it receives for research. It does not make explicit that this involves actually carrying out experiments designed to manipulate emotions in their customers, especially not negative ones.
No indeed. We routinely sign boilerplate, assuming that it addresses exotic disputes we are never likely to be involved in. No one familiar with the meaning of the term “informed consent” in experiments on human subjects thinks that assent to Facebook’s generalities about terms of use is informed consent for unwitting participation in a science study. In the matters such boilerplate typically addresses, many of us reason that, were something egregious to happen, the Crown prosecutor or a wealthy public interest litigant would step in.
As it turns out, dream on.
Glance points out,
The fact that the researchers and Facebook did not ask for consent suggests that they knew that there would be a backlash when it became public and that it would be easier to deal with this after the fact.
Indeed. The key victory for Facebook is to obtain the presumptive right to conduct such experiments without meaningful consent on the part of the participants—by going ahead and doing it, and later weathering the storm successfully. This is the new world of Big Data.
The Facebook response pretty much confirms this view: Adam Kramer, Facebook data scientist and first author of the PNAS study, offers on behalf of his employer, “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product.”
We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.
Kramer, if you care so much about me, why don’t you tell me what you are doing to me, allegedly for my own good?
I am a human Facebook user, not an animal at the vet’s. I would understand what you were telling me. I could also have said no, leave me out. I have my own emotional issues just now.
He goes on, pricelessly, to say that “our goal was never to upset anyone.”
No indeed; their goal was more likely to create a situation in which no one would be in a position to do anything if they were upset. He also claims that Facebook has “improved” its internal review practices since then, obliquely admitting that there was a problem after all.
So what did the researchers find, and what was the reaction? And is it really no different from advertisers’ “messing with our minds”? Yes, it is different; see Facepalm: Facebook experiments on its users, Part II.

Denyse O’Leary is a Canadian journalist, author, and blogger.

