
Facebook ‘filter bubble’ study raises more questions than it answers

By Mathew Ingram
May 7, 2015, 10:10 PM ET

Facebook has been criticized for some time for its role in creating a “filter bubble” among its users, a term that comes from a book of the same name by Eli Pariser (who went on to help create viral-content site Upworthy). Critics say the social network shapes our perception of the world through its algorithmically filtered newsfeed. Facebook, however, has come out with a study that it says proves this isn’t true—if there is a filter bubble, the company says, it exists because users choose to see certain things, not because of Facebook’s algorithmic filters.

But is this really what the study proves? There’s considerable debate about that among social scientists knowledgeable in the field, who note that the conclusions Facebook wants us to draw—by saying, for example, that the study “establishes that … individual choices matter more than algorithms”—aren’t necessarily supported by the evidence actually provided in the paper.

For one thing, these researchers point out that the study only looked at a tiny fraction of the total Facebook user population: less than 4% of the overall user base, in fact (a number which doesn’t appear in the study itself but is only mentioned in an appendix). That’s because the study group was selected only from those users who specifically mention their political affiliation. Needless to say, extrapolating from that to the entire 1.2 billion-user Facebook universe is a huge leap.

Sociologist Nathan Jurgenson points out that while the study claims to prove conclusively that individual choices have more effect on what users see than algorithms do, its own data doesn’t fully back this up. That conclusion appears to hold for conservative users, but for users who identified themselves as liberals, Facebook’s data shows that exposure to different ideological views is reduced more by the algorithm (8%) than by a user’s personal choices.

But even that’s not the biggest problem, Jurgenson and others say. The biggest issue is that the Facebook study pretends that individuals choosing to limit their exposure to different topics is a completely separate thing from the Facebook algorithm doing so. The study makes it seem like the two are disconnected and can be compared to each other on some kind of equal basis. But in reality, says Jurgenson, the latter exaggerates the former, because personal choices are what the algorithmic filtering is ultimately based on:

“Individual users choosing news they agree with and Facebook’s algorithm providing what those individuals already agree with is not either-or but additive. That people seek that which they agree with is a pretty well-established social-psychological trend… what’s important is the finding that [the newsfeed] algorithm exacerbates and furthers this filter bubble.”

Sociologist and social-media expert Zeynep Tufekci points out in a post on Medium that trying to separate and compare these two things represents the worst “apples to oranges comparison I’ve seen recently,” since the two things that Facebook is pretending are unrelated have significant cumulative effects, and in fact are tied directly to each other. In other words, Facebook’s algorithmic filter magnifies the already human tendency to avoid news or opinions that we don’t agree with.

“Comparing the individual choice to algorithmic suppression is like asking about the amount of trans fatty acids in french fries, a newly-added ingredient to the menu, and being told that hamburgers, which have long been on the menu, also have trans-fatty acids.”

Christian Sandvig, an associate professor at the University of Michigan, calls the Facebook research the “not our fault” study, since it is clearly designed to absolve the social network of blame for people not being exposed to contrary news and opinion. In addition to the framing of the research — which tries to claim that being exposed to differing opinions isn’t necessarily a positive thing for society — the conclusion that user choice is the big problem just doesn’t ring true, says Sandvig (who has written a paper about the biased nature of Facebook’s algorithm).

“The tobacco industry might once have funded a study that says that smoking is less dangerous than coal mining, but here we have a study about coal miners smoking…. there is no scenario in which user choices vs. the algorithm can be traded off, because they happen together. Users select from what the algorithm already filtered for them. It is a sequence.”

Jurgenson also talks about this, and about how Facebook’s attempt to argue that its algorithm is somehow unbiased or neutral — and that the big problem is what users decide to click on and share — is disingenuous. The whole reason why some (including Tufekci, who has written about this before) are so concerned about algorithmic filtering is that users’ behavior is ultimately determined by that filtering. The two processes are symbiotic, so arguing that one is worse than the other makes no sense.

In other words, not only does the study not actually prove what it claims to prove, but the argument that the site is making in defense of its algorithm also isn’t supported by the facts—and in fact, can’t actually be proven by the study as it currently exists. And as Eli Pariser points out in his piece on Medium about the research, the study also can’t be reproduced (a crucial element of any scientific research) because the only people who are allowed access to the necessary data are researchers who work for Facebook.

Correction, May 8, 2015: An earlier version of this post misidentified Christian Sandvig. He is an associate professor at the University of Michigan.

