Privacy group files FTC complaint over Facebook's 'emotional contagion' study
Source: Nancy Weil
Facebook “purposefully messed with people’s minds” in a “secretive and non-consensual” study on nearly 700,000 users whose emotions were intentionally manipulated when the company altered their news feeds for research purposes, a digital privacy rights group charges in a complaint filed with the U.S. Federal Trade Commission.
The Electronic Privacy Information Center filed the complaint Thursday, asking the FTC to impose sanctions on Facebook. The study violated the terms of a 20-year consent decree that requires the social-networking company to protect its users’ privacy, EPIC said. EPIC also wants Facebook to be forced to disclose the algorithms it uses to determine what appears in users’ news feeds.
The complaint follows days of mounting outrage from privacy-rights advocates and Facebook users, some of whom are quoted in the EPIC complaint, after results of the study, published June 2 by the Proceedings of the National Academy of Sciences, became widely known. Researchers from Facebook, the University of California, San Francisco and Cornell University conducted the study from Jan. 11 to Jan. 18, 2012, on 689,003 English-speaking Facebook users. The study, however, was conducted for Facebook’s internal purposes.
The research sought to show whether emotions can be influenced with no face-to-face contact by altering Facebook’s algorithm to show mostly positive or negative posts. Scientists call that “emotional contagion.” The study found that people whose news feeds contained more positive comments tended to make more positive comments and those who took in more negative posts were more bummed out in their own posts.
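To make that mechanism concrete, here is a minimal sketch, in Python, of the study’s two-condition design. It is not Facebook’s actual code: the published paper classified posts using LIWC word lists, and the tiny word lists, function names (score_sentiment, filter_feed) and parameters below are illustrative stand-ins only.

    import random

    # Hypothetical stand-ins for the LIWC word lists the paper used.
    POSITIVE = {"great", "happy", "love", "wonderful"}
    NEGATIVE = {"sad", "awful", "hate", "terrible"}

    def score_sentiment(text: str) -> int:
        """Crude sentiment score: positive word hits minus negative word hits."""
        words = text.lower().split()
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    def filter_feed(posts, condition, omit_chance=0.5):
        """Return a feed with posts of one polarity probabilistically omitted.

        condition is 'reduce_negative' (yielding a mostly positive feed) or
        'reduce_positive' (yielding a mostly negative feed).
        """
        kept = []
        for post in posts:
            s = score_sentiment(post)
            targeted = (condition == "reduce_negative" and s < 0) or (
                condition == "reduce_positive" and s > 0
            )
            if targeted and random.random() < omit_chance:
                continue  # withhold this post from the user's feed
            kept.append(post)
        return kept

    feed = ["What a wonderful day", "This news is awful", "Lunch was fine"]
    print(filter_feed(feed, "reduce_negative"))

According to the paper, each targeted post in the real experiment had between a 10 percent and 90 percent chance of being omitted, tied to the user’s ID, which is why the sketch withholds posts probabilistically rather than removing them all.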
When news of the study emerged, researchers and eventually Facebook COO Sheryl Sandberg said that the research wasn’t explained well by the company, though the apologies struck many as well short of an actual mea culpa.
PNAS editor-in-chief Inder M. Verma published an “editorial expression of concern” on Wednesday regarding the study, saying the researchers contend it was consistent with Facebook’s data use policy, to which users agree when they sign up, thereby giving “informed consent” for their data to be used in research. Because the research was conducted internally by Facebook, it did not fall under the auspices of Cornell’s Human Research Protection Program, the statement says.
The statement further notes that, as a private company, Facebook was under no obligation to follow what is known among researchers as the “Common Rule”: obtaining informed consent from study participants and allowing them to opt out if they don’t want to be part of the research.
“Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper,” Verma wrote. “It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out.”
Apart from the privacy implications and the questionable ethics of experimenting on social-media users who had no idea they were part of a study, there is a question of timing: Facebook altered its data use policy in May 2012, four months after the study was conducted. Only the revised policy specifically said that user data might be used for research and could be shared with researchers, a major point of contention in the EPIC complaint.
Regulators in the U.K. and Ireland are also investigating the study.
Asked for comment about EPIC’s FTC complaint, Facebook issued a statement via email saying that the company isn’t commenting on the complaint specifically.
“When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer,” the company also said via email. “To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether their privacy policy uses the word ‘research’ or not.”