Facebook is facing mounting criticism from users angered by the ethics of an experiment, conducted by the social media giant and university researchers, that manipulated users’ News Feeds to see the effect it had on their overall mood.
The study, published this month in the Proceedings of the National Academy of Sciences, involved a Facebook data scientist and two university researchers who over one week in 2012 turned 689,003 unsuspecting users’ News Feeds either positive or negative to see how it would affect their mood.
The goal was to find out whether emotions are “contagious” on social networks and, according to the published study, they are. However, many users are angered by the researchers’ actions because the researchers never told users that their News Feeds were being manipulated.
According to Forbes, Facebook’s Data Use Policy legally allows the company to use people’s data for research, but contention remains over the ethics of manipulating that data, and consequently people’s emotions, without their consent.
“None of the data used was associated with a specific person’s Facebook account,” a Facebook spokesperson told Forbes. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow.”
Social media has since exploded with angry messages from users concerned about Facebook’s actions. Susan Fiske, the Princeton University psychology professor who edited the study for publication, called the experiment “an open ethical question”.
“It’s ethically okay from the regulations perspective, but ethics are kind of social decisions. There’s not an absolute answer,” she told The Atlantic. “And so the level of outrage that appears to be happening suggests that maybe it shouldn’t have been done…I’m still thinking about it and I’m a little creeped out, too.”
Facebook data scientist Adam Kramer, who helped run the study, responded to the backlash in a post on his Facebook page. He said that he and his co-researchers undertook the study to make Facebook better, but acknowledged that their motivations were not clearly outlined in the paper.
“We care about the emotional impact of Facebook and the people that use our product,” he writes. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
“In hindsight,” he added, “the research benefits of the paper may not have justified all of this anxiety.” He said Facebook is working on improving its internal review practices for approving experiments and that it will “incorporate what we’ve learned from the reaction to this paper.”