Even as Facebook tries to convince news publishers like the New York Times to publish directly on its platform (instead of just posting excerpts with links to their websites), the company continues to demonstrate why that is such a Faustian bargain. On Tuesday, for example, the social-networking behemoth announced some new tweaks to its News Feed algorithm, and warned that publishers might see a decline in "post reach and referral traffic" as a result.
In its post about the new changes, Facebook (fb) tried to soften the blow by pointing out that referral traffic to media publishers has more than doubled in the past 18 months, and that it is always trying to help publishers find the right audience for their content by "optimizing how it is discovered and consumed." The problem, of course, is that no one really knows what Facebook means by terms like "optimization." Does it mean choosing the highest-quality content? Showing users what they want? Some combination of the two? It's unclear.
What is clear is that news publishers, and media companies of all kinds, have no real choice when it comes to dealing with Facebook, regardless of the terms of engagement. The social network is one of the largest digital platforms in existence, with a global audience of more than 1.2 billion, and it is also how a majority of younger users find their news. Choosing to avoid Facebook simply isn't an option if you want your content to be found.
Now you see it, now you don't
Unfortunately, how Facebook feels about your content can differ from one moment to the next. Fans of the social-gaming company Zynga (znga) know this all too well: games like FarmVille were once worth hundreds of millions of dollars because Facebook promoted them, but their vast audience disintegrated almost overnight when the platform changed its algorithm. What's ironic about the company's latest negotiations with publishers is that news companies got much the same treatment not long ago. Several outlets built "social reader" applications that attracted millions of readers, until Facebook changed course and downgraded their content.
On top of such overt moves, Facebook continually shapes the news that users see by removing or hiding content it believes might offend them, including videos and even entire Facebook pages about the conflict in Syria, material that investigative journalists like Eliot Higgins argue is a crucial record of events there.
All of this is frustrating enough for news companies, but what's even more frustrating is that Facebook continues to pretend that it doesn't influence the discovery of news or how it is consumed, even though that's exactly what it does (as it admits in the announcement about its latest update). And that kind of influence can have a profound impact on how users see the world.
A double-edged sword for news
In a speech at the International Journalism Festival in Italy last week, Facebook's director of news partnerships, Andy Mitchell, rejected the idea that the social platform is some kind of gatekeeper when it comes to the discovery of news, saying Facebook doesn't control the News Feed: users control it by telling Facebook what they are interested in. In other words, Facebook sees itself as merely reflecting the desires of its users. Journalism professor George Brock, who challenged Mitchell at the event, responded in a blog post:
"For the senior news guy with such gatekeeper and distribution power to evade these questions is condescending and dishonest. Facebook is not, and knows quite well it is not, a neutral machine passing on news. Its algorithm chooses what people see, it has community standards that material must meet and it has to operate within the laws of many countries."
Despite the protests from Mitchell and others that Facebook doesn't manipulate its News Feed, founder and CEO Mark Zuckerberg and the company's vice president of product Chris Cox have repeatedly talked about wanting the News Feed to show users "high-quality" news and other content, and they have made a number of changes to de-emphasize or hide clickbait-style content and promote pieces they feel are more worthwhile.
The bottom line is that the giant social platform wants to have its cake and eat it too: it wants to tweak the News Feed to promote content that serves its purposes, whether that's news content or baby pictures, but it also wants to pretend that it isn't a gatekeeper, because then media companies might not play ball. So it portrays the algorithm as a harmless extension of its users' interests, when in fact it is anything but. It is Facebook's most powerful weapon, and a blade that cuts both ways for the media industry.