
Facebook's Informed Consent Problem

2014-06-30

Tags: Psychology, Technology, autonomy, ethics, facebook, research
You've undoubtedly heard by now about Facebook's large-scale emotion manipulation study, conducted on the site's users. The study, published in the Proceedings of the National Academy of Sciences of the United States of America, found that when Facebook users saw a greater concentration of negative posts in their news feeds, they were more likely to post negative statuses themselves; the same pattern emerged for positive status updates. [This research probably also partially explains Facebook's insistence on pushing the Top Stories sort on users regardless of their preference; it's the manipulation in a massive social science study. Which doesn't make it any less of a violation of users' sense of autonomy, and thus a poor motivational experience.]

The study has its problems, which I'll get into, but the thing that really makes me angry about it is the cavalier attitude it reveals toward informed consent. Informed consent is a requirement of human subjects research. What it means is that if a person is being manipulated in any way, they must give the researcher explicit permission to include them in the study. The "informed" piece is important: it means the study participant must understand what activities will take place as part of the study, so he can make the decision to participate with full knowledge of the risks and benefits. This doesn't mean you have to tell the participant your hypotheses or the specific manipulations that will take place, but you do need to make sure he understands what he will experience. In the case of the Facebook study, informed consent might look like "I agree to have the content of my news feed determined by an algorithm that may show some types of stories more often than others."

Facebook, of course, did not do this. They claim that informed consent was covered as part of their user agreement. Technically, they're right: the user agreement gives Facebook permission to do all sorts of stuff with user data. [EDIT July 1, 2014: A new report suggests that Facebook only updated their TOS to mention user research four months after the completion of this study, which means they no longer have this justification to rest on.] But it violates the principles of informed consent in several ways.

Honestly, the manipulation in this study won't cause long-term harm to anyone who was affected. But that's not really the point. The point is that there are professional codes of honor that were deeply violated here. The Belmont Report, a seminal document describing how researchers should interact with study participants, specifically highlights "respect for persons" as one of the three cornerstones of ethical research. I certainly don't feel Facebook has shown respect for persons here. As a professional in a field that has a long and ugly history of ethical missteps (start with the Tuskegee Syphilis Study if you're interested), and which has worked hard to put systems in place to avoid repeating such grave errors, I take this kind of behavior from Facebook somewhat personally.

[Photo caption: It's like I knew that one day I'd be angry at Facebook! Circa 2006]

I read another comment from someone who pointed out that companies manipulate consumer experiences all the time in order to change outcomes. This is true.
But when Google tests different types of search results or eBay tweaks product arrays, they do so in the name of boosting their business. They do not purport to be doing research for the peer-reviewed literature. They do not pretend to be social scientists. It is wrapping this work in the guise of an academic endeavor while flouting the attendant academic ethics that is problematic.

On to the other issues with the research, in case you're curious what popped out to me on an initial reading:

It's disappointing to me that this research was accepted for peer-reviewed publication. It sends a disheartening message about the importance of ethics in human subjects research, and it also reinforces that if you're big enough and powerful enough, you can get away with stuff the little guys can't (not that they should want to). It also proves yet again what John Oliver told us about net neutrality: if you want to do something evil, hide it inside something boring.

Additional perspectives on the study which I found valuable to read: