Facebook ‘filter bubble’ study raises more questions than it answers

By Mathew Ingram
May 7, 2015, 10:10 PM ET

Facebook has been criticized for some time for its role in creating a “filter bubble” among its users, a term that comes from a book of the same name by Eli Pariser (who went on to help create viral-content site Upworthy). Critics say the social network does this by shaping our perception of the world with its algorithmically filtered newsfeed. Facebook, however, has come out with a study that it says proves this isn’t true—if there is a filter bubble, the company says, it exists because users choose to see certain things, not because of Facebook’s algorithmic filters.

But is this really what the study proves? There’s considerable debate about that among social scientists knowledgeable in the field, who note that the conclusions Facebook wants us to draw—by saying, for example, that the study “establishes that … individual choices matter more than algorithms”—aren’t necessarily supported by the evidence actually provided in the paper.

For one thing, these researchers point out that the study looked at only a tiny fraction of the total Facebook user population: less than 4% of the overall user base, in fact (a number that doesn’t appear in the study itself but is only mentioned in an appendix). That’s because the study group was selected only from those users who specifically mention their political affiliation. Needless to say, extrapolating from that to the entire 1.2 billion-user Facebook universe is a huge leap.

Sociologist Nathan Jurgenson points out that while the study claims to conclusively prove that individual choices have more effect on what users see than algorithms do, it doesn’t actually back this up. That appears to hold for conservative users, but for users who identified themselves as liberals, Facebook’s own data shows that exposure to differing ideological views is reduced more by the algorithm (8%) than by a user’s personal choices.

But even that’s not the biggest problem, Jurgenson and others say. The biggest issue is that the Facebook study treats individuals choosing to limit their exposure to different topics as though it were completely separate from the Facebook algorithm doing so. The study makes it seem as if the two are disconnected and can be compared on some kind of equal basis. But in reality, says Jurgenson, the latter amplifies the former, because personal choices are what the algorithmic filtering is ultimately based on:

“Individual users choosing news they agree with and Facebook’s algorithm providing what those individuals already agree with is not either-or but additive. That people seek that which they agree with is a pretty well-established social-psychological trend… what’s important is the finding that [the newsfeed] algorithm exacerbates and furthers this filter bubble.”

Sociologist and social-media expert Zeynep Tufekci points out in a post on Medium that trying to separate and compare these two things represents the worst “apples to oranges comparison I’ve seen recently,” since the two things that Facebook is pretending are unrelated have significant cumulative effects, and in fact are tied directly to each other. In other words, Facebook’s algorithmic filter magnifies the already human tendency to avoid news or opinions that we don’t agree with.

“Comparing the individual choice to algorithmic suppression is like asking about the amount of trans fatty acids in french fries, a newly-added ingredient to the menu, and being told that hamburgers, which have long been on the menu, also have trans-fatty acids.”

Christian Sandvig, an associate professor at the University of Michigan, calls the Facebook research the “not our fault” study, since it is clearly designed to absolve the social network of blame for people not being exposed to contrary news and opinion. In addition to the framing of the research — which tries to claim that being exposed to differing opinions isn’t necessarily a positive thing for society — the conclusion that user choice is the big problem just doesn’t ring true, says Sandvig (who has written a paper about the biased nature of Facebook’s algorithm).

“The tobacco industry might once have funded a study that says that smoking is less dangerous than coal mining, but here we have a study about coal miners smoking…. there is no scenario in which user choices vs. the algorithm can be traded off, because they happen together. Users select from what the algorithm already filtered for them. It is a sequence.”
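
To see why the two stages can’t simply be traded off against each other, here is a back-of-the-envelope sketch in Python, offered purely as an illustration: the only figure taken from the study is the 8% algorithmic reduction reported for self-identified liberals, while the baseline share of cross-cutting stories and the size of the click-through effect are hypothetical placeholders. The point is only that the user’s choice operates on a feed the algorithm has already filtered, so the two effects compound.

# Illustrative sketch only. The 8% algorithmic suppression figure is the one
# the study reports for self-identified liberals; every other number here is
# a hypothetical placeholder. The sketch shows why "user choice vs. algorithm"
# is not an either-or comparison: the user chooses from a feed the algorithm
# has already filtered, so the two effects compound rather than trade off.

BASELINE_CROSS_CUTTING = 0.40    # share of cross-cutting stories friends share (hypothetical)
ALGORITHM_SUPPRESSION = 0.08     # newsfeed ranking removes 8% of them (figure cited for liberals)
USER_CHOICE_SUPPRESSION = 0.06   # user clicks 6% fewer of those that do appear (hypothetical)

# Stage 1: the newsfeed algorithm decides what shows up in the feed at all.
shown = BASELINE_CROSS_CUTTING * (1 - ALGORITHM_SUPPRESSION)

# Stage 2: the user decides what to click, but only from what stage 1 surfaced.
clicked = shown * (1 - USER_CHOICE_SUPPRESSION)

total_reduction = 1 - clicked / BASELINE_CROSS_CUTTING

print(f"Cross-cutting share of what friends share: {BASELINE_CROSS_CUTTING:.0%}")
print(f"After algorithmic ranking:                 {shown:.1%}")
print(f"After the user's own clicks:               {clicked:.1%}")
print(f"Combined reduction in exposure:            {total_reduction:.1%}")

Run as written, the sketch reports a combined reduction of roughly 13.5%, larger than either stage measured on its own, which is exactly the compounding that Jurgenson and Sandvig describe.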

Jurgenson also talks about this, and about how Facebook’s attempt to argue that its algorithm is somehow unbiased or neutral — and that the big problem is what users decide to click on and share — is disingenuous. The whole reason some observers (including Tufekci, who has written about this before) are so concerned about algorithmic filtering is that what users see, and therefore what they can click on and share, is itself shaped by that filtering. The two processes are symbiotic, so arguing that one is worse than the other makes no sense.

In other words, not only does the study fail to prove what it claims to prove, but the argument the site is making in defense of its algorithm isn’t supported by the evidence the study actually provides. And as Eli Pariser points out in his piece on Medium about the research, the study also can’t be reproduced (a crucial element of any scientific research), because the only people allowed access to the necessary data are researchers who work for Facebook.

Correction, May 8, 2015: An earlier version of this post misidentified Christian Sandvig. He is an associate professor at the University of Michigan.
