The Great Facebook Experiment Train Wreck

 

Last month, the Proceedings of the National Academy of Sciences published the results of an experiment conducted by Facebook on 689,003 users, which involved manipulating users’ News Feeds to reduce the amount of either positive or negative emotional content they saw. As summarized by Forbes’ Kashmir Hill, “[i]f there was a week in January 2012 where you were only seeing photos of dead dogs or incredibly cute babies, you may have been part of the study.”
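To make concrete what manipulating a feed means here, the sketch below shows the general shape of such a filter: posts carrying the targeted emotional valence are withheld with some probability. Everything in it is hypothetical; Facebook’s actual implementation is not public, and the published study reportedly flagged posts using word-count software (LIWC) rather than a stored per-post label.

```python
import random

# Hypothetical sketch of the study's manipulation, not Facebook's actual code.
# Each post is assumed to carry a valence label; the study itself derived
# valence from word counts rather than a stored field.

def filter_feed(posts, condition, omit_probability=0.5, rng=random):
    """Withhold a fraction of posts whose emotional valence matches the condition."""
    target = "positive" if condition == "reduce_positive" else "negative"
    return [
        post for post in posts
        if post["valence"] != target or rng.random() >= omit_probability
    ]

feed = [
    {"text": "Best day ever!", "valence": "positive"},
    {"text": "Feeling awful today.", "valence": "negative"},
    {"text": "Bought groceries.", "valence": "neutral"},
]
print(filter_feed(feed, "reduce_positive"))  # positive posts randomly withheld
```

The point is not fidelity to Facebook’s systems but how little machinery the manipulation requires.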

The study’s findings were summarized as follows:

When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.

Once the report circulated beyond academic journals, a firestorm erupted, revealing that at almost every step of the study, and in the damage control that followed, Facebook made the wrong choice.

(1) The Creepy Factor

At a 2008 OMMA Behavioral Targeting Conference, then-Tacoda CEO Dave Morgan warned the industry that just because something is permitted does not mean it is wise. Referring specifically to some of the emerging companies using Deep Packet Inspection technology, he bluntly stated, “I don’t think it passes the creepy factor.”

Facebook’s first mistake was proceeding with the study without a Creepy Alarm going off. The mere suggestion of experimentation on unwitting or involuntary human subjects evokes some of the most infamous episodes of human experimentation, from Nazi Germany to the Tuskegee Syphilis Study. For that reason, there are rules and ethical standards in place that govern such research.

For example, the World Medical Association’s Declaration of Helsinki (Ethical Principles for Medical Research Involving Human Subjects) provides that:

each potential subject must be adequately informed of the aims, methods, sources of funding, any possible conflicts of interest, institutional affiliations of the researcher, the anticipated benefits and potential risks of the study and the discomfort it may entail, post-study provisions and any other relevant aspects of the study. The potential subject must be informed of the right to refuse to participate in the study or to withdraw consent to participate at any time without reprisal.

Research receiving federal funds is subject to regulations providing that:

no investigator may involve a human being as a subject in research covered by this policy unless the investigator has obtained the legally effective informed consent of the subject or the subject’s legally authorized representative. An investigator shall seek such consent only under circumstances that provide the prospective subject or the representative sufficient opportunity to consider whether or not to participate and that minimize the possibility of coercion or undue influence. The information that is given to the subject or the representative shall be in language understandable to the subject or the representative.

While Facebook is not necessarily subject to these requirements, these are the prevailing standards in the field.

In addition, consent alone does not eliminate creepiness: the infamous Stanford Prison Experiment, after all, involved volunteers.

(2) Misrepresenting Institutional Clearance

Studies of this type are ordinarily subject to independent ethical review by what is known as an Institutional Review Board (IRB). Facebook apparently suggested that such a review had been conducted, but in this case the only review was Facebook’s own internal one.

(3) Implied Informed Consent

Facebook contends the research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”  The clause in question is as follows:

For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.

There are two problems with this contention:

(i) relying on the single word “research” in a clause buried in the Data Use Policy hardly constitutes the “express consent” contemplated for such arrangements; and

(ii) more significantly, the clause in question was added four months after the study was completed.

(4) Implied Informed Consent, Um, I Mean No Consent . . . So What’s The Big Deal?

Tal Yarkoni, the director of the Psychoinformatics Lab at the University of Texas, argues:

The reality is that Facebook–and virtually every other large company with a major web presence–is constantly conducting large controlled experiments on user behavior. Data scientists and user experience researchers at Facebook, Twitter, Google, etc. routinely run dozens, hundreds, or thousands of experiments a day, all of which involve random assignment of users to different conditions. Typically, these manipulations aren’t conducted in order to test basic questions about emotional contagion; they’re conducted with the explicit goal of helping to increase revenue. . . . [I]t’s worth keeping in mind that there’s nothing intrinsically evil about the idea that large corporations might be trying to manipulate your experience and behavior. Everybody you interact with–including every one of your friends, family, and colleagues–is constantly trying to manipulate your behavior in various ways.
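The routine experimentation Yarkoni describes usually rests on nothing more exotic than deterministic bucketing: hash a stable user ID together with an experiment name, and the hash decides the condition. The sketch below is illustrative only; the function name, experiment name, and the three conditions are invented for this example, not taken from Facebook’s systems.

```python
import hashlib

# Illustrative sketch of how sites typically randomize users into experiment
# conditions: hashing a stable user ID with the experiment name yields an
# assignment that is effectively random across users yet reproducible for
# any one user. All names and conditions here are hypothetical.

def assign_condition(user_id, experiment, conditions):
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return conditions[int(digest, 16) % len(conditions)]

print(assign_condition("user-12345", "emotion-study",
                       ["control", "reduce_positive", "reduce_negative"]))
```

Note that nothing in this mechanism ever asks the user anything, which is precisely the point of contention.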

But, as Jaron Lanier explains in the New York Times:

The manipulation of emotion is no small thing. An estimated 60 percent of suicides are preceded by a mood disorder. Even mild depression has been shown to increase the risk of heart failure by 5 percent; moderate to severe depression increases it by 40 percent.

Then there is the Federal Trade Commission. In 2011, Facebook entered into a consent decree with the FTC that, among other things, provided that Facebook was:

  • barred from making misrepresentations about the privacy or security of consumers’ personal information;
  • required to obtain consumers’ affirmative express consent before enacting changes that override their privacy preferences;
  • required to establish and maintain a comprehensive privacy program designed to address privacy risks associated with the development and management of new and existing products and services, and to protect the privacy and confidentiality of consumers’ information.

There is no word yet from the Federal Trade Commission on the study, but officials in the UK and Ireland are said to be looking into the matter.  One significant factor is that the study did not exclude minors.

(5)  Our Experiment Designed to Upset You Was Not Intended to Upset You

Rule number one of crisis management is to stop digging. Facebook kept digging, never grasping how big this issue had become. Facebook COO Sheryl Sandberg offered a weak apology to the Wall Street Journal, saying the purpose of the project was “poorly communicated” and “for that communication we apologize. We never meant to upset you.”

Actually, Sheryl, you did – that was the whole purpose of the study.  That is precisely the type of non-apology that only makes matters worse and earns you an entry on SorryWatch.

 

More Information: Kashmir Hill, Forbes: (1) Facebook Manipulated 689,003 Users’ Emotions For Science; (2) Sheryl Sandberg Apologizes for Emotion Manipulation Study – Kind Of; (3) Facebook Added ‘Research’ To User Agreement 4 Months After Emotion Manipulation Study; and (4) Facebook Doesn’t Understand The Fuss About Its Emotion Manipulation Study

Facebook Consent Decree, FTC; Tal Yarkoni, In Defense of Facebook; Even the Editor of Facebook’s Mood Study Thought It Was Creepy, The Atlantic; Facebook Faces UK Probe of Emotion Study, BBC; How to Communicate in a Crisis, Inc. Magazine; Top 10 Evil Human Experiments, Listverse; Calling for an Apology Cease Fire, New York Times; Jaron Lanier, On the Lack of Transparency in the Facebook Study, New York Times; Why the “Everybody Does It” Defense of Facebook’s Emotional Manipulation Experiment Is Bogus, Salon; Did Facebook and PNAS Violate Human Research Protections in an Unethical Experiment?, Science-Based Medicine; Facebook’s Unethical Experiment, Slate; Facebook Study Sparks Ethical Questions, Wall Street Journal