Mark Zuckerberg, CEO of Facebook.
Photograph by David Paul Morris — Bloomberg/Getty Images

A Facebook study rebuts criticism that it fills its news feed with political posts that users are likely to agree with.

By Kia Kokalitcheva
May 7, 2015

When you scroll through your Facebook news feed, does it seem that your friends’ political posts align with your own beliefs and that every article confirms what you already think?

If so, then it’s largely in your head, according to a study the social networking giant released Thursday. In fact, the views surfaced in users’ feeds are typically quite varied, the study found, helping to rebut complaints that the service – with the help of its shadowy algorithm that decides how prominently to display individual posts – has become an echo chamber for like-minded people.

Critics have long blamed Facebook and other social media companies for creating a more politically polarized climate. The thinking is that such services automatically filter out viewpoints that people don’t agree with, as a sort of business strategy.

But with the research, published Thursday in Science, Facebook is pushing back. The peer-reviewed journal lends the findings a legitimacy that a typical self-serving press release couldn’t provide.

“We found that most people have friends who claim an opposing political ideology, and that the content in peoples’ News Feeds reflect those diverse views,” the company wrote in a blog post. “News Feed surfaces content that is slightly more aligned with an individual’s own ideology, however the friends you choose and the content you click on are more important factors than News Feed ranking in terms of how much content you encounter that cuts across ideological lines.”

In its research, Facebook examined 10.1 million of its U.S. users for insight into how similar their political views were to their friends’, how likely they were to encounter diverse views in their news feeds, and how they interacted with content that aligned with or opposed their own views.

In a blog post discussing the findings, Facebook pointed to two interesting phenomena. First, a 2012 analysis found that much of the information users are exposed to and engage with on the service comes from friends with whom they have relatively weak ties. The explanation is that, yes, users are more likely to share content from people they’re close to and who have similar outlooks. But social networks are typically so large – filled with people who are mere acquaintances – that much of the content surfaced in the news feed is inevitably diverse.

Second, recent research has shown that an endorsement from someone in our social network can trump political ideology. For example, users are more likely to click on a news article espousing an opposing political view that a friend has shared than on one supporting their own ideology from a source with whom they have no social connection.

Overall, Facebook found the following (a toy illustration of these figures follows the list):

  • On average, 23% of people’s friends claim an opposing political ideology.
  • Of the hard news content that people’s friends share, 29.5% cuts across ideological lines.
  • Of the hard news content that people see in News Feed, 28.9% cuts across ideological lines, on average.
  • Of the hard news content that people actually click on, 24.9% cuts across ideological lines.

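The last three figures trace a funnel: the share of cross-cutting hard news shrinks a little at each step, from what friends share (29.5%) to what the feed surfaces (28.9%) to what users click (24.9%). As a rough back-of-the-envelope illustration of how such shares can be computed, here is a minimal Python sketch; the ideology labels, the example data, and the cross_cutting_share helper are all invented for illustration, and this is not Facebook’s actual methodology or code.

    # Toy model of the study's cross-cutting measure (invented, illustrative only).
    # Each hard-news item carries an ideology label; an item "cuts across"
    # ideological lines when its label opposes the viewer's own.
    from dataclasses import dataclass

    @dataclass
    class Item:
        ideology: str   # e.g. "liberal" or "conservative" (hypothetical labels)
        seen: bool      # True if the item surfaced in the viewer's News Feed
        clicked: bool   # True if the viewer clicked through

    def cross_cutting_share(items, user_ideology, stage):
        # Fraction of items at a given funnel stage whose label opposes the user's.
        pool = [item for item in items if stage(item)]
        if not pool:
            return 0.0
        return sum(item.ideology != user_ideology for item in pool) / len(pool)

    # Invented data: four items shared by one conservative user's friends.
    shared = [
        Item("liberal", seen=True, clicked=False),
        Item("conservative", seen=True, clicked=True),
        Item("liberal", seen=False, clicked=False),
        Item("conservative", seen=True, clicked=True),
    ]

    print(cross_cutting_share(shared, "conservative", lambda i: True))       # shared: 0.5
    print(cross_cutting_share(shared, "conservative", lambda i: i.seen))     # seen: 1/3
    print(cross_cutting_share(shared, "conservative", lambda i: i.clicked))  # clicked: 0.0

Even in this toy data the cross-cutting share narrows at each stage, mirroring the study’s 29.5% to 28.9% to 24.9% pattern; Facebook’s argument is that most of that narrowing comes from whom users befriend and what they click, not from the ranking algorithm.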
While Facebook’s data teams constantly investigate questions about how users behave and how the social network affects them, the echo chamber question could be of particular importance as Facebook’s interest in serving as a news source continues to grow.
