“Experimental evidence of massive-scale emotional contagion through social networks” was published in the June 17, 2014 Proceedings of the National Academy of Sciences of the United States of America. A team from Facebook, U.C. San Francisco, and Cornell University conducted an experimental study during one week in 2012 to test whether emotions on Facebook are contagious. Their finding: yes.*
Let’s pick some of that apart, shall we?
Experimental (as opposed to observational): The researchers set up controls and variables and deliberately changed the environment to observe a reaction under rigorous(ish) conditions.
Emotional Contagion: Nothing new here; the idea that emotions can spread through a group has been around for a long time. What the study wanted to know was whether seeing positive posts on Facebook makes you more likely to post positive things yourself, and vice versa.
*But not really. John M. Grohol, Psy.D., of PsychCentral tears their methodology, results, and presentation of the effect more or less to shreds.
Bottom line: Facebook deliberately screwed around with the News Feed content (like they do) of just shy of 700,000 people for the express purpose of seeing whether they got happier or crankier depending on the emotional content of what they were seeing, and they did it so badly that their results can't really be seen as useful.
The problems with this study are profound (David Gorski, MD, PhD, FACS of Science-Based Medicine picks the issue apart very thoroughly), but the one that’s receiving the most attention is whether or not this was (a) legal or (b) ethical.
Because of lovely gems like the Stanley Milgram experiments and, of course, the research Nazi scientists conducted on prisoners during World War II, there are many laws, protocols, and ethical guidelines in place to protect subjects in any research on humans. In the U.S., these are primarily governed by the federal Common Rule, under which review by an Institutional Review Board (IRB) is strictly regulated and absolutely mandatory for:
- Organizations that have received federal grant funding (which includes Cornell and UCSF, and possibly the study itself, though the back and forth on that has been muddy)
- Drug companies that would like FDA approval for their treatments
Sadly, private companies doing privately funded research that doesn't need a government rubber stamp for further funding or for the sale or implementation of a treatment can basically thumb their noses at the Common Rule, which means Facebook is, legally speaking, probably in the clear. To the best of my understanding, however, because Cornell and UCSF faculty were involved in the research, the Cornell and UCSF IRBs should have reviewed the study. The closest either seems to have come is Cornell's approval based on a "pre-existing" dataset received from Facebook, which might actually be enough to keep them in the clear, but only if their faculty had no hand in the experimental design or the data collection and worked solely on the analysis.
Bottom line: The study might actually have been technically legal. Maybe.
When a study's basic legality is as hotly debated as this one's has been, it's probably safe to say the study is not going to meet the ethical guidelines either. But let's be fair, and let's stay focused on informed consent. Let's pretend that someone at Facebook gave a tenth of a fig and took the time to ponder this:
IRB Latitude to Approve a Consent Procedure that Alters or Waives some or all of the Elements of Consent
§ 46.116 – An IRB may approve a consent procedure, which does not include, or which alters, some or all of the elements of informed consent set forth in this section, or waive the requirements to obtain informed consent, provided the IRB finds and documents that:
- (c)(1) The research or demonstration project is to be conducted by, or subject to the approval of, state or local government officials, and is designed to study, evaluate, or otherwise examine: (i) public benefit or service programs; (ii) procedures for obtaining benefits or services under those programs; (iii) possible changes in or alternatives to those programs or procedures; or (iv) possible changes in methods or levels of payment for benefits or services under those programs; and
- (c)(2) The research could not practicably be carried out without the waiver or alteration.
- (d)(1) The research involves no more than minimal risk to the subjects;
- (d)(2) The waiver or alteration will not adversely affect the rights and welfare of the subjects;
- (d)(3) The research could not practicably be carried out without the waiver or alteration; and
- (d)(4) Whenever appropriate, the subjects will be provided with additional pertinent information after participation.
Focusing on (d)(1) through (d)(4), ALL of which must be met: the risk involved was probably minimal, sure. One could perhaps also argue that the subjects' rights and welfare weren't adversely affected, though there's a strong case for the reverse, and using a random population means the researchers couldn't know whose moods they were manipulating or how that manipulation might ripple out into those people's lives. Regarding the third point: having written HSR consent forms before, I would argue that there is absolutely no reason that obtaining active consent would have hurt the methodology of this study. And finally, the subjects were not (as far as I can tell) informed after the experiment was done, as they should have been.
Even if the study had somehow met the three requirements it failed, it's important not to overlook that all of this falls under the context of what an IRB is allowed to approve. The entire point of an IRB is to have educated individuals with a background in research give an unbiased analysis of a study's ethics: someone to force the point of obtaining consent when the researchers can't be bothered. No one held that position of unbiased analysis in this case.
Bottom line: This study belongs very soundly in the category of pointless ethical fail.
The Bottom Bottom Line
Facebook used a pretty damn sneaky legal loophole to pointlessly skirt the requirement to obtain consent for a study whose methodology was all the French fries short of a Happy Meal, all to understand how to keep people happy so they'd keep coming back and could be sold more stuff. By doing this, they managed to piss a lot of people off while learning very little of reliable use.
That, my friends, is what irony looks like.