A Facebook (FB) data scientist who was involved in the social network’s 2012 emotional experiment apologized on his public profile page and expressed doubt over the project’s necessity.
“My coauthors and I are very sorry for the way the paper described the research and any anxiety it caused,” wrote Adam Kramer. “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
The post was a response to the public outcry over an experiment Facebook ran in 2012 on almost 700,000 unknowing users to determine whether it could affect their feelings by showing them either more positive or more negative content.
For one week, the site’s data scientists ran an algorithm that filtered users’ news feeds toward either positive or negative posts, classified by the emotional words they contained. The results were published in the March issue of the Proceedings of the National Academy of Sciences.
The news that Facebook used its algorithmic powers to play with users’ emotions sparked outrage among some Facebook users and the wider media. In one blog post on Animalnewyork.com, Sophie Weiner accused Facebook of treating users like common “lab rats.”
Kramer, one of the three data scientists who oversaw the experiment, took to Facebook to clarify the motivations behind the emotional baiting.
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” said Kramer. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
Kramer, who apologized after explaining the researchers’ reasoning, said the actual impact of the emotional content was minimal: affected users typically posted an average of one fewer emotional word per thousand words over the following week.
This study is only one of many run by Facebook’s team of data scientists that analyze the reams of data generated by the site’s 1.28 billion users worldwide. Mark Zuckerberg even implemented his own organ donor experiment in 2012.
Such experimentation is legal. By signing up for the service, users agree to provide their data and profiles to Facebook through the terms of service. While some of that information is already used to target advertisements, the remaining portion that fills Facebook’s servers is not accessible to marketers but is instead used by the company for research.
While the study was conducted anonymously and Facebook didn’t violate any privacy rules, there has been popular outcry that the company acted unethically by manipulating people.
“Manipulating unknowing users’ emotional states to get there puts Facebook’s big toe on that creepy line,” wrote Kashmir Hill in a blog post response on Forbes.com. “When universities conduct studies on people, they have to run them by an ethics board first to get approval.”
Kramer said that Facebook has since been working on improving its “internal review practices,” though he didn’t say whether that will include stronger standards for informing users about experiments affecting their emotions.