By David Z. Morris
March 11, 2018

Search for a political topic on YouTube, and you’re likely to be nudged to watch increasingly extreme, misleading, or outright false content on that topic. If your interest is in left-wing topics, the site’s recommendation algorithm will point you towards corresponding left-wing conspiracy theories. The same goes for searches on conservative-leaning topics.

That, at least, was the conclusion of communications researcher Zeynep Tufekci after an informal experiment described Saturday in the New York Times. Tufekci’s exercise was unscientific, and she notes that Google doesn’t like sharing hard data with researchers. But a Wall Street Journal investigation came to similar conclusions last month, with the help of a former YouTube engineer.

As with recent research showing that fake news spreads faster than facts on Twitter, these findings about YouTube’s algorithm can’t be blamed on any nefarious plot to destabilize the world. Instead, the problem seems inherent to the intersection of human nature and YouTube’s business model.

Just like Facebook, Twitter, and good old television, YouTube makes money from ads, and therefore from audiences’ attention. Over time, Tufekci writes, YouTube’s “algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.” But YouTube, which like Facebook and Twitter would rather be seen as a neutral “platform” than as a publisher, doesn’t take the kind of responsibility for its content that a television broadcaster is required to take.

If that’s true, even stricter enforcement action on individual videos — such as YouTube’s recent decision to reprimand conspiracy theorist Alex Jones — amounts to little more than a game of whack-a-mole.

As the Journal pointed out in its report last month, it clearly doesn’t have to be this way. YouTube parent company Google weighs the trustworthiness of content when returning web search results, giving priority to mainstream news sites. YouTube has chosen not to implement anything so straightforward. YouTube told the Journal that its task is harder than Google’s because far fewer videos than written stories are available about breaking news events.

And of course, YouTube didn’t create political divisiveness. To claim so would be to ignore longer-term trends that make elected officials more likely to appeal to extremes, including extreme gerrymandering and the rise of political dark money.

But YouTube and other digital platforms may very well be making those problems worse, and fast.

