Facebook manipulated over 600K users for psych experiment


Facebook manipulated users’ emotions for an academic study involving more than 600,000 accounts, sparking new concerns over the tech giant's privacy policies.

The resulting paper, written by a team including Facebook data scientist Adam Kramer, focused on whether seeing positive or negative content on a user’s Facebook homepage would alter that user’s mood.


The study changed newsfeed content for 689,003 users to show less positive or less negative content and then monitored the tone of each user’s Facebook posts to see how they reacted.

“We show, via a massive ... experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the authors wrote about their study.

The paper stirred controversy over the weekend, with commenters questioning whether users, who had agreed only to the site’s standard Terms of Service policy, actually consented to participate in such an experiment, and whether the study was properly vetted by research authorities.

In a Facebook post on Sunday, Kramer defended the study but noted the backlash.

“The goal of all of our research at Facebook is to learn how to provide a better service,” he wrote.

“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.”

Kramer said he understood the concerns and apologized for “any anxiety [the paper] caused.”

“In hindsight, the research benefits of the paper may not have justified all of this anxiety,” he wrote.

A Facebook spokesman told Forbes that the company does “research to improve our services and to make the content people see on Facebook as relevant and engaging as possible,” including “understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow.”

The official also defended the company’s “strong internal review process” and privacy practices, saying that none of the data used in the study was linked to a specific Facebook user.

“There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely,” the official said.