Why Facebook’s algorithm matters: more than 60% of millennials get their political news there
Younger Internet users like to joke that Facebook “is the new TV,” but when it comes to political news consumption that appears to be literally true, according to a new study from the Pew Research Center. More than 60% of the millennials surveyed said they had gotten political news from Facebook during the previous week, compared with 37% who got it from TV.
For older members of the Baby Boom generation, meanwhile, those figures were almost exactly reversed: about 60% of that age group said they got most of their political news from local television, while about 39% said they got it from Facebook. (Only 14% of millennials said they got political news from Twitter.)
If nothing else, the survey provides more evidence that political activists and sociologists are right to be concerned about the influence that the Facebook news-feed algorithm can have on the news that users are exposed to. Even small tweaks to the Facebook ecosystem and the types of content that are shown in the feed could have a huge effect on a large segment of the population, and in largely unexpected ways.
After Facebook, the political-news sources millennials cited most often were CNN, local TV stations and Google News, followed by ABC, Fox, NBC and Yahoo. Members of Generation X also named Facebook as the place where they most often get their news, with 51% ranking it as their number-one source, followed by local TV stations and then CNN. Google News ranked far higher among millennials than it did with either Gen X or the Baby Boomers.
Facebook came under fire recently for a study it funded, conducted by several of its in-house scientists, which looked at whether news-feed users were exposed to differing political points of view. Although the study concluded that users’ own choices determined how much they were exposed to different viewpoints, a number of experts took issue with that explanation.
These experts pointed out that Facebook’s own data confirmed that for one test group, the algorithmically filtered news feed did affect the amount of alternative political commentary and news they saw. Even more important, the study treated a user’s experience on the site as though it could be examined separately from the functioning of the algorithm, when the two are so closely linked that it is almost impossible to pull them apart.
One prominent critic of Facebook’s algorithmic filtering, sociologist Zeynep Tufekci of the University of North Carolina, pointed out in a piece published on Medium that the way an algorithm functions can have real-world consequences, as it arguably did during the unrest in Ferguson, Mo., after an unarmed black man was shot by police. Many Facebook users didn’t see that news in their feeds; instead, they saw innocuous videos of celebrities taking the “Ice Bucket Challenge.”
Facebook has also been criticized in the past for removing content that breaches its internal standards, even when that content has news or public-policy value. Investigative journalist Eliot Higgins, for example, complained that the site removed pages belonging to Syrian rebel groups, even though some of those pages were the only source of important information about the country’s ongoing civil war.
What are the long-term implications of Facebook’s influence over the news diet of younger users? We don’t know, which is why journalists like George Brock of City University London have challenged the giant social network to take some responsibility for how its algorithm helps shape users’ view of the world. It is also one more reason to be concerned that some media companies are handing their content over to Facebook, putting themselves at the mercy of that algorithm.